Proceedings chapter

Comix: joint estimation and lightspeed comparison of mixture models

Presented at Shanghai (China), 20-25 March 2016
Publisher: Piscataway (N.J.): IEEE
Publication date: 2016

The Kullback-Leibler divergence is a widespread dissimilarity measure between probability density functions, based on the Shannon entropy. Unfortunately, no analytic formula is available to compute this divergence between mixture models, which imposes the use of costly approximation algorithms. In order to reduce the computational burden when many divergence evaluations are needed, we introduce a sub-class of mixture models in which the component parameters are shared across a set of mixtures, so that the only degree of freedom is each mixture's vector of weights. This sharing allows the design of extremely fast versions of existing dissimilarity measures between mixtures. We demonstrate the effectiveness of our approach by evaluating the quality of the ordering produced by our method on a real dataset.
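
To illustrate why weight sharing helps, here is a minimal sketch (not the paper's exact measure): when two mixtures are built from the same component densities and differ only in their weight vectors, the log-sum inequality bounds their Kullback-Leibler divergence from above by the discrete KL divergence between the weight vectors, which costs only O(k) for k components. The helper `discrete_kl` below is illustrative.

```python
import numpy as np

def discrete_kl(w, v, eps=1e-12):
    """KL divergence between two weight vectors (categorical distributions).

    eps guards against log(0) for zero weights; it slightly perturbs the
    result but keeps the computation finite.
    """
    w = np.asarray(w, dtype=float)
    v = np.asarray(v, dtype=float)
    return float(np.sum(w * np.log((w + eps) / (v + eps))))

# Two mixtures sharing the same component densities, differing only in weights.
w1 = np.array([0.7, 0.2, 0.1])
w2 = np.array([0.1, 0.3, 0.6])

# O(k) upper bound on KL(mixture1 || mixture2) via the log-sum inequality,
# replacing a costly Monte Carlo or approximation-based evaluation.
bound = discrete_kl(w1, w2)
```

The speed-up comes from never touching the component densities at comparison time: once the shared components are estimated jointly, each pairwise comparison reduces to arithmetic on short weight vectors.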

Keywords
  • Mixture model
  • Density estimation
  • Information geometry
  • Kullback-Leibler divergence
  • Exponential family
Citation (ISO format)
SCHWANDER, Olivier, MARCHAND-MAILLET, Stéphane, NIELSEN, Frank. Comix: joint estimation and lightspeed comparison of mixture models. In: IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016. Shanghai (China). Piscataway (N.J.) : IEEE, 2016. p. 2449–2453. doi: 10.1109/ICASSP.2016.7472117
Main files (1)
Proceedings chapter (Published version)

Technical information

Creation: 09/13/2019 3:42:00 PM
First validation: 09/13/2019 3:42:00 PM
Update time: 03/15/2023 6:02:01 PM
Status update: 03/15/2023 6:02:01 PM
Last indexation: 01/17/2024 6:16:15 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva