In recent years, mixture models have found widespread use in discovering latent cluster structure from data. A popular special case of finite mixture models is the family of naive Bayes (NB) models, where the probability of a feature vector factorizes over the features for any given component of the mixture. Despite their popularity, naive Bayes models do not allow data points to belong to different component clusters to varying degrees, i.e., mixed memberships, which restricts their modeling ability. In this paper, we propose mixed-membership naive Bayes (MMNB) models. On one hand, MMNB can be viewed as a generalization of NB obtained by placing a Dirichlet prior on top to allow mixed memberships. On the other hand, MMNB can also be viewed as a generalization of latent Dirichlet allocation (LDA) with the ability to handle heterogeneous feature vectors with different types of features, e.g., real, categorical, etc. We propose two variational inference algorithms to learn MMNB models. The first is based on ideas originally used in LDA; the second uses substantially fewer variational parameters, leading to a significantly faster algorithm. Further, we extend MMNB/LDA to discriminative mixed-membership models for classification by suitably combining MMNB/LDA with multi-class logistic regression. The efficacy of the proposed mixed-membership models is demonstrated by extensive experiments on several datasets, including UCI benchmarks, recommendation systems, and text datasets.
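As a minimal sketch of the generative process the abstract describes, the following code samples one data point from an MMNB-style model: a Dirichlet prior yields a per-point mixed-membership vector, and each feature independently draws its own component before being generated. All parameter values here (3 components, 4 real-valued features with Gaussian component distributions) are hypothetical choices for illustration, not the paper's experimental setup.

```python
import numpy as np

def sample_mmnb_point(alpha, means, stds, rng):
    """Sample one feature vector from a mixed-membership naive Bayes
    generative process with Gaussian (real-valued) features.

    alpha : Dirichlet prior parameters, shape (k,)
    means, stds : per-component, per-feature Gaussian parameters, shape (k, d)
    """
    k, d = means.shape
    theta = rng.dirichlet(alpha)       # mixed-membership vector for this point
    x = np.empty(d)
    for j in range(d):
        z = rng.choice(k, p=theta)     # each feature picks its own component
        x[j] = rng.normal(means[z, j], stds[z, j])
    return theta, x

# Hypothetical parameters: 3 latent components, 4 real-valued features.
rng = np.random.default_rng(0)
alpha = np.ones(3)
means = rng.normal(size=(3, 4))
stds = np.full((3, 4), 0.5)
theta, x = sample_mmnb_point(alpha, means, stds, rng)
```

Unlike a plain NB mixture, where one component is drawn per data point, here `theta` lets different features of the same point come from different components; with categorical features the Gaussian draw would simply be replaced by a categorical one, which is how MMNB accommodates heterogeneous feature types.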
Acknowledgements We want to warmly thank Nikunj Oza for valuable input on discriminative mixed membership models. The research was supported by National Science Foundation grants IIS-0812183, IIS-0916750, National Science Foundation CAREER grant IIS-0953274, and National Aeronautics and Space Administration grant NNX08AC36A.
- Generative models
- Latent Dirichlet allocation
- Logistic regression
- Naive Bayes
- Variational inference