Probabilistic matrix addition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Scopus citations

Abstract

We introduce Probabilistic Matrix Addition (PMA) for modeling real-valued data matrices by simultaneously capturing covariance structure among rows and among columns. PMA additively combines two latent matrices drawn from two Gaussian Processes respectively over rows and columns. The resulting joint distribution over the observed matrix does not factorize over entries, rows, or columns, and can thus capture intricate dependencies in the matrix. Exact inference in PMA is possible, but involves inversion of large matrices, and can be computationally prohibitive. Efficient approximate inference is possible due to the sparse dependency structure among latent variables. We propose two families of approximate inference algorithms for PMA based on Gibbs sampling and MAP inference. We demonstrate the effectiveness of PMA for missing value prediction and multi-label classification problems.
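The generative process described in the abstract can be sketched as follows: one latent matrix has columns drawn from a GP over row indices, the other has rows drawn from a GP over column indices, and the observed matrix is their sum plus noise. This is a minimal illustrative sketch, not the paper's implementation; the RBF kernels, lengthscales, and noise level used here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 4  # number of rows and columns of the data matrix

def rbf(x, lengthscale=1.0):
    """Squared-exponential covariance over 1-D index positions (assumed kernel)."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Covariance among rows (n x n) and among columns (m x m), with jitter.
K_rows = rbf(np.arange(n, dtype=float)) + 1e-6 * np.eye(n)
K_cols = rbf(np.arange(m, dtype=float)) + 1e-6 * np.eye(m)

# F: each of the m columns is an independent GP draw over the n rows.
F = rng.multivariate_normal(np.zeros(n), K_rows, size=m).T  # shape (n, m)

# G: each of the n rows is an independent GP draw over the m columns.
G = rng.multivariate_normal(np.zeros(m), K_cols, size=n)    # shape (n, m)

# Observed matrix: additive combination plus observation noise.
Y = F + G + 0.1 * rng.standard_normal((n, m))
```

Because every entry Y[i, j] shares F with its column and G with its row, the joint distribution over Y does not factorize over entries, rows, or columns, which is the coupling the abstract refers to.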

Original language: English (US)
Title of host publication: Proceedings of the 28th International Conference on Machine Learning, ICML 2011
Pages: 1025-1032
Number of pages: 8
State: Published - 2011
Event: 28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States
Duration: Jun 28, 2011 - Jul 2, 2011

Publication series

Name: Proceedings of the 28th International Conference on Machine Learning, ICML 2011


