An information theoretic analysis of maximum likelihood mixture estimation for exponential families

Arindam Banerjee, Inderjit Dhillon, Joydeep Ghosh, Srujana Merugu

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

Abstract

An important task in unsupervised learning is maximum likelihood mixture estimation (MLME) for exponential families. In this paper, we prove a mathematical equivalence between this MLME problem and the rate distortion problem for Bregman divergences. We also present new theoretical results in rate distortion theory for Bregman divergences. Further, we analyze both problems as a trade-off between compression and preservation of information, an analysis that yields the information bottleneck method as an interesting special case.
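The equivalence stated in the abstract can be made concrete with a small sketch. The code below is an illustration, not the paper's algorithm: the function name bregman_soft_clustering, the choice of the squared Euclidean divergence, and the trade-off parameter beta are assumptions introduced here. Under those assumptions, alternating minimization of expected Bregman distortion coincides with EM-based maximum likelihood estimation for a mixture of spherical Gaussians with variance 1 / (2 * beta), which is the flavor of equivalence the paper establishes in general for exponential families.

```python
import numpy as np

def squared_euclidean(X, mu):
    # Squared Euclidean distance d(x, mu) = ||x - mu||^2, the Bregman
    # divergence generated (up to scaling) by phi(x) = ||x||^2.
    return np.sum((X[:, None, :] - mu[None, :, :]) ** 2, axis=2)

def bregman_soft_clustering(X, k, beta=0.5, n_iter=100, seed=0):
    """EM-style alternating minimization of expected Bregman distortion
    (a hypothetical sketch, not the paper's implementation).

    For the squared Euclidean divergence, these updates match maximum
    likelihood estimation for a mixture of spherical Gaussians with
    variance 1 / (2 * beta); beta plays the role of the rate distortion
    trade-off parameter."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    mu = X[rng.choice(n, size=k, replace=False)]  # initialize means at data points
    pi = np.full(k, 1.0 / k)                      # uniform mixing weights
    for _ in range(n_iter):
        # E-step: soft assignments p(h | x) proportional to
        # pi_h * exp(-beta * d(x, mu_h)).
        logits = np.log(pi)[None, :] - beta * squared_euclidean(X, mu)
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        # M-step: each mean is the conditional expectation of its cluster;
        # optimality of the mean as the best representative is the property
        # that characterizes Bregman divergences among distortion measures.
        pi = p.mean(axis=0)
        mu = (p.T @ X) / p.sum(axis=0)[:, None]
    return mu, pi, p

if __name__ == "__main__":
    # Two well-separated Gaussian blobs as toy data.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-2.0, 0.4, size=(100, 2)),
                   rng.normal(2.0, 0.4, size=(100, 2))])
    mu, pi, p = bregman_soft_clustering(X, k=2)
    print("means:", mu)
    print("weights:", pi)
```

Raising beta tightens the assignments toward hard clustering (less compression, lower distortion), while lowering it softens them, which is the compression-versus-preservation trade-off the abstract refers to.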

Original language: English (US)
Title of host publication: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004
Editors: R. Greiner, D. Schuurmans
Pages: 57-64
Number of pages: 8
State: Published - Dec 1 2004
Event: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004 - Banff, Alta., Canada
Duration: Jul 4 2004 - Jul 8 2004

Publication series

Name: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004

Other

Other: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004
Country/Territory: Canada
City: Banff, Alta.
Period: 7/4/04 - 7/8/04
