Abstract
Despite their well-documented capability in modeling nonlinear functions, kernel methods fall short in large-scale learning tasks due to their excessive memory and computational requirements. The present work introduces a novel kernel approximation approach from a dimensionality reduction point of view on virtually lifted data. The proposed framework accommodates feature extraction under limited storage and computational resources, and subsequently provides a kernel approximation via a linear inner product over the extracted features. Probabilistic guarantees on the generalization of the proposed task are provided, and efficient solvers with provable convergence guarantees are developed. By introducing a sampling step that precedes the dimensionality reduction task, the framework is further broadened to accommodate learning over large datasets. The connection between the novel method and the Nyström kernel approximation algorithm, along with its modifications, is also presented. Empirical tests validate the effectiveness of the proposed approach.
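The abstract relates the proposed method to the Nyström kernel approximation, which likewise replaces the full kernel with a linear inner product over extracted features. Below is a minimal sketch of the classical Nyström scheme for reference; the function names, the RBF kernel choice, and all parameters are illustrative assumptions, not details from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m=100, gamma=1.0, seed=0):
    # Classical Nyström: sample m landmark points, then map each point
    # to a low-dimensional feature vector whose inner products
    # approximate the full kernel matrix: K ≈ Phi @ Phi.T.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    landmarks = X[idx]
    K_mm = rbf_kernel(landmarks, landmarks, gamma)  # m x m landmark kernel
    K_nm = rbf_kernel(X, landmarks, gamma)          # n x m cross-kernel
    # Pseudo-inverse square root of K_mm via eigendecomposition,
    # dropping numerically zero eigenvalues for stability.
    w, V = np.linalg.eigh(K_mm)
    keep = w > 1e-10
    W = V[:, keep] / np.sqrt(w[keep])
    # Feature map: Phi = K_nm K_mm^{-1/2}, so Phi @ Phi.T = K_nm K_mm^+ K_mn.
    return K_nm @ W

# Toy usage: compare the linear inner-product approximation to the exact kernel.
X = np.random.default_rng(1).normal(size=(200, 5))
Phi = nystrom_features(X, m=100, gamma=0.5)
K_exact = rbf_kernel(X, X, gamma=0.5)
K_approx = Phi @ Phi.T
rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

Downstream learners can then operate on `Phi` with linear methods, which is what makes the inner-product form attractive at scale.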
Original language | English (US) |
---|---|
Title of host publication | 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 596-603 |
Number of pages | 8 |
ISBN (Electronic) | 9781538632666 |
DOIs | |
State | Published - Jul 1 2017 |
Externally published | Yes |
Event | 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 - Monticello, United States. Duration: Oct 3 2017 → Oct 6 2017 |
Publication series
Name | 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 |
---|---|
Volume | 2018-January |
Other
Other | 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 |
---|---|
Country/Territory | United States |
City | Monticello |
Period | 10/3/17 → 10/6/17 |
Bibliographical note
Publisher Copyright: © 2017 IEEE.