Matrix completion and extrapolation via kernel regression

Pere Giménez-Febrer, Alba Pagès-Zamora, Georgios B. Giannakis

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

Matrix completion and extrapolation (MCEX) are dealt with here over reproducing kernel Hilbert spaces (RKHSs) in order to account for prior information present in the available data. Aiming at a fast and low-complexity solver, the task is formulated as one of kernel ridge regression. The resultant MCEX algorithm can also afford online implementation, while the class of kernel functions also encompasses several existing approaches to MC with prior information. Numerical tests on synthetic and real datasets show that the novel approach is faster than widespread methods such as alternating least-squares (ALS) or stochastic gradient descent (SGD), and that the recovery error is reduced, especially when dealing with noisy data.
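The abstract describes completing a partially observed matrix by kernel ridge regression over an RKHS. The paper's exact formulation is not reproduced here; the following is a minimal sketch of the general idea, assuming a product kernel over row and column indices (`Kr[i,i'] * Kc[j,j']`). All function names, kernel choices, and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def krr_matrix_complete(M, mask, Kr, Kc, lam=1e-2):
    """Fill in a matrix M from its observed entries (mask == True)
    via kernel ridge regression with the product kernel
    k((i,j),(i',j')) = Kr[i,i'] * Kc[j,j'].  Illustrative sketch only."""
    obs = np.argwhere(mask)                # observed (row, col) index pairs
    y = M[mask]                            # observed values (row-major order)
    # Gram matrix over observed entries: elementwise product of row/column kernels
    G = Kr[np.ix_(obs[:, 0], obs[:, 0])] * Kc[np.ix_(obs[:, 1], obs[:, 1])]
    # Ridge-regularized solve for the dual coefficients
    alpha = np.linalg.solve(G + lam * np.eye(len(y)), y)
    # Cross-kernel between every entry (i, j) and the observed entries,
    # shape (n_rows, n_cols, n_obs) by broadcasting
    Kx = Kr[:, obs[:, 0]][:, None, :] * Kc[:, obs[:, 1]][None, :, :]
    return Kx @ alpha                      # completed matrix, shape of M

# Usage: RBF kernels on row/column indices encode smoothness as prior information
n, m = 10, 8
r, c = np.arange(n), np.arange(m)
Kr = np.exp(-(r[:, None] - r[None, :]) ** 2 / 8.0)
Kc = np.exp(-(c[:, None] - c[None, :]) ** 2 / 8.0)
M = np.outer(np.sin(r / 3.0), np.cos(c / 3.0))     # smooth ground truth
mask = np.random.default_rng(0).random((n, m)) < 0.7
M_hat = krr_matrix_complete(M, mask, Kr, Kc, lam=1e-3)
```

Because a kernel defines the estimate at every index pair, the same fit extrapolates to rows or columns with no observed entries at all, provided the kernel links them to observed ones; this is what distinguishes MCEX from plain low-rank completion.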

Original language: English (US)
Article number: 8786233
Pages (from-to): 5004-5017
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 67
Issue number: 19
DOIs
State: Published - Oct 1 2019
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2019 IEEE.

Keywords

  • Extrapolation
  • Graphs
  • Kernel ridge regression
  • Matrix completion
  • Online learning
  • RKHS
