

On Hölder Projective Divergences

Authors: NIELSEN, Frank; SUN, Ke; MARCHAND-MAILLET, Stéphane
Published in: Entropy. 2017, vol. 19, no. 3/122
Abstract: We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy–Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family, provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions, which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy–Schwarz divergence. (The Cauchy–Schwarz special case is sketched after the record below.)
Keywords: Hölder inequalities; Hölder divergences; Projective divergences; Cauchy–Schwarz divergence; Hölder escort divergences; Skew Bhattacharyya divergences; Exponential families; Conic exponential families; Escort distribution; Clustering
Identifiers: DOI: 10.3390/e19030122
Full text: Other version: http://www.mdpi.com/1099-4300/19/3/122
Structures:
Research groups: Computer Vision and Multimedia Laboratory; Viper group
Citation (ISO format): NIELSEN, Frank, SUN, Ke, MARCHAND-MAILLET, Stéphane. On Hölder Projective Divergences. In: Entropy, 2017, vol. 19, n° 3/122. doi: 10.3390/e19030122 https://archive-ouverte.unige.ch/unige:103033
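
For orientation, here is a minimal sketch of the "tightness of an inequality" construction summarized in the abstract, restricted to the plain Hölder inequality. The paper's two classes of Hölder divergences and pseudo-divergences use a more general parameterization (including an extra exponent and escort distributions), so treat this only as the simplest illustrative case recovering the Cauchy–Schwarz divergence.

```latex
% Sketch: a projective divergence from the tightness of Hölder's inequality.
% For conjugate exponents \alpha,\beta > 1 with 1/\alpha + 1/\beta = 1 and
% nonnegative (not necessarily normalized) densities p, q, Hölder's inequality gives
%   \int p q \,dx \le (\int p^\alpha dx)^{1/\alpha} (\int q^\beta dx)^{1/\beta}.
% Measuring its tightness by a negative log-ratio yields a nonnegative dissimilarity:
\[
  D^{\mathrm{H}}_{\alpha}(p:q)
  \;=\;
  -\log
  \frac{\int p(x)\,q(x)\,\mathrm{d}x}
       {\bigl(\int p(x)^{\alpha}\,\mathrm{d}x\bigr)^{1/\alpha}
        \bigl(\int q(x)^{\beta}\,\mathrm{d}x\bigr)^{1/\beta}}
  \;\ge\; 0 .
\]
% The ratio is unchanged under rescaling p -> \lambda p and q -> \lambda' q,
% which is why such divergences are called projective and need no normalization.
% For \alpha = \beta = 2 this reduces to the Cauchy–Schwarz divergence:
\[
  D_{\mathrm{CS}}(p:q)
  \;=\;
  -\log
  \frac{\int p(x)\,q(x)\,\mathrm{d}x}
       {\sqrt{\int p(x)^{2}\,\mathrm{d}x}\,\sqrt{\int q(x)^{2}\,\mathrm{d}x}} .
\]
```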