Latent Dirichlet conditional naive-Bayes models

Arindam Banerjee, Hanhuai Shan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Scopus citations

Abstract

In spite of the popularity of probabilistic mixture models for latent structure discovery from data, mixture models do not have a natural mechanism for handling sparsity, where each data point has only a few non-zero observations. In this paper, we introduce conditional naive-Bayes (CNB) models, which generalize naive-Bayes mixture models to naturally handle sparsity by conditioning the model on observed features. Further, we present latent Dirichlet conditional naive-Bayes (LD-CNB) models, which constitute a family of powerful hierarchical Bayesian models for latent structure discovery from sparse data. The proposed family of models is quite general and can work with arbitrary regular exponential family conditional distributions. We present a variational inference based EM algorithm for learning, along with special case analyses for Gaussian and discrete distributions. The efficacy of the proposed models is demonstrated by extensive experiments on a wide variety of datasets.
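The abstract describes a hierarchical generative process: a per-point Dirichlet-distributed mixing vector, a latent component for each observed feature, and an exponential-family emission conditioned on that component. As a rough illustration of the Gaussian special case, the sketch below (an assumption based only on the abstract, not the paper's actual notation; all function and variable names are hypothetical) shows how conditioning on the set of observed features lets the model ignore missing entries:

```python
import random

def sample_dirichlet(alpha, rng):
    # Draw from Dirichlet(alpha) by normalizing independent Gamma draws.
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def generate_point(observed_features, alpha, means, stds, rng):
    """Sketch of an LD-CNB-style generative process (Gaussian case,
    as inferred from the abstract): draw a per-point mixing vector
    theta ~ Dirichlet(alpha); for each OBSERVED feature j, draw a
    latent component z_j ~ Multinomial(theta) and emit a Gaussian
    observation for (z_j, j). Unobserved features are simply never
    generated, which is how conditioning handles sparsity."""
    k = len(alpha)
    theta = sample_dirichlet(alpha, rng)
    point = {}
    for j in observed_features:
        # Categorical draw z_j ~ Multinomial(theta) by inverse CDF.
        u, z, acc = rng.random(), 0, theta[0]
        while u > acc and z < k - 1:
            z += 1
            acc += theta[z]
        # Gaussian emission for feature j under component z.
        point[j] = rng.gauss(means[z][j], stds[z][j])
    return theta, point

# Hypothetical usage: 2 latent components, 5 features, only features
# {0, 3} observed for this sparse data point.
rng = random.Random(0)
theta, point = generate_point(
    observed_features={0, 3},
    alpha=[1.0, 1.0],
    means=[[0.0] * 5, [5.0] * 5],
    stds=[[1.0] * 5, [1.0] * 5],
    rng=rng,
)
```

Learning in the paper proceeds by a variational EM algorithm rather than sampling; this sketch only illustrates the forward (generative) direction.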

Original language: English (US)
Title of host publication: Proceedings of the 7th IEEE International Conference on Data Mining, ICDM 2007
Pages: 421-426
Number of pages: 6
DOIs
State: Published - 2007
Event: 7th IEEE International Conference on Data Mining, ICDM 2007 - Omaha, NE, United States
Duration: Oct 28 2007 - Oct 31 2007

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
ISSN (Print): 1550-4786

Other

Other: 7th IEEE International Conference on Data Mining, ICDM 2007
Country/Territory: United States
City: Omaha, NE
Period: 10/28/07 - 10/31/07
