Scientific article
Open access

On Hölder Projective Divergences

Published in: Entropy, vol. 19, no. 3/122
Publication date: 2017

We describe a framework for building distances by measuring the tightness of inequalities, and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the ordinary and reverse Hölder inequalities and present two novel classes of Hölder divergences and pseudo-divergences, both of which encapsulate the Cauchy–Schwarz divergence as a special case. We report closed-form formulas for those statistical dissimilarities when the distributions belong to the same exponential family, provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). These new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions, which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy–Schwarz divergence.
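The rescaling invariance claimed in the abstract (projectivity) is easy to check numerically for the Cauchy–Schwarz divergence, the shared special case of both Hölder classes. The sketch below is not from the paper: it discretizes two unnormalized Gaussian-shaped densities on a uniform grid and evaluates D_CS(p, q) = −log(⟨p, q⟩ / (‖p‖·‖q‖)); on a uniform grid the spacing cancels in the ratio, just as normalization constants do.

```python
import numpy as np

def cs_divergence(p_vals, q_vals):
    """Cauchy-Schwarz divergence between two densities sampled on the same
    uniform grid: -log( <p,q> / (||p|| * ||q||) ).  The grid spacing and any
    positive scaling of p or q cancel in the ratio (projectivity)."""
    inner = np.dot(p_vals, q_vals)
    norms = np.sqrt(np.dot(p_vals, p_vals) * np.dot(q_vals, q_vals))
    return -np.log(inner / norms)

# Two unnormalized Gaussian-shaped densities on a common grid.
xs = np.linspace(-10.0, 10.0, 20001)
p = np.exp(-0.5 * (xs - 0.0) ** 2)          # shape of N(0, 1)
q = np.exp(-0.5 * ((xs - 1.5) / 0.8) ** 2)  # shape of N(1.5, 0.8^2)

d = cs_divergence(p, q)
print(d)                          # strictly positive, since p and q differ
print(cs_divergence(p, p))        # ~0: the divergence of a density to itself
print(cs_divergence(3.0 * p, q))  # equals d: invariant to rescaling p
```

By the Cauchy–Schwarz inequality the ratio inside the logarithm lies in (0, 1], so the divergence is nonnegative and vanishes exactly when p and q are proportional, which is why no normalization step is needed before comparing densities.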

  • Hölder inequalities
  • Hölder divergences
  • Projective divergences
  • Cauchy–Schwarz divergence
  • Hölder escort divergences
  • Skew Bhattacharyya divergences
  • Exponential families
  • Conic exponential families
  • Escort distribution
  • Clustering
Citation (ISO format)
NIELSEN, Frank, SUN, Ke, MARCHAND-MAILLET, Stéphane. On Hölder Projective Divergences. In: Entropy, 2017, vol. 19, n° 3/122. doi: 10.3390/e19030122
Main files (1)
Article (Published version)
ISSN of the journal: 1099-4300

Technical information

Creation: 03/16/2018 5:31:00 PM
First validation: 03/16/2018 5:31:00 PM
Update time: 03/15/2023 7:59:17 AM
Status update: 03/15/2023 7:59:16 AM
Last indexation: 02/12/2024 12:50:12 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva