Abstract
Kernel ridge regression plays a central role in various signal processing and machine learning applications. Suitable kernels are often chosen as linear combinations of 'basis kernels' by optimizing criteria under regularization constraints. Although such approaches offer reliable generalization performance, solving the associated min-max optimization problems faces major challenges, especially with big data inputs. After analyzing the key properties of a convex reformulation, the present paper introduces an efficient algorithm based on a generalization of Nesterov's acceleration method, which achieves an order-optimal convergence rate among first-order methods. Closed-form updates are derived for common regularizers. Experiments on real datasets demonstrate considerable speedups over competing algorithms.
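As a minimal illustration of the setting the abstract describes (not the paper's algorithm), the sketch below fits kernel ridge regression with a fixed convex combination of two hypothetical basis kernels, and solves the resulting strongly convex quadratic with Nesterov's accelerated gradient method using the constant momentum for strongly convex objectives. The kernel choices, weights, and regularization value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Two hypothetical basis kernels: Gaussian (RBF) and linear.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-0.5 * sq)
K_lin = X @ X.T

# Fixed convex combination; the paper's method optimizes these
# weights jointly, here they are simply illustrative constants.
theta = np.array([0.7, 0.3])
K = theta[0] * K_rbf + theta[1] * K_lin

lam = 0.1                               # ridge regularization (assumed)
A = K + lam * np.eye(n)                 # KRR system matrix
alpha_closed = np.linalg.solve(A, y)    # closed-form KRR solution

# Nesterov's accelerated gradient on the quadratic
#   f(a) = 0.5 * a^T A a - a^T y,   grad f(a) = A a - y,
# with step 1/L and the constant momentum (sqrt(L)-sqrt(mu))/(sqrt(L)+sqrt(mu))
# valid for mu-strongly convex, L-smooth objectives.
eigs = np.linalg.eigvalsh(A)
mu, L = eigs[0], eigs[-1]
beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))

a = np.zeros(n)   # iterate
z = np.zeros(n)   # extrapolated point
for _ in range(1000):
    a_next = z - (A @ z - y) / L        # gradient step at z
    z = a_next + beta * (a_next - a)    # momentum extrapolation
    a = a_next

print(np.allclose(a, alpha_closed, atol=1e-6))
```

The accelerated iterates converge geometrically at rate roughly (1 - 1/sqrt(L/mu)) per step, matching the closed-form solution to high accuracy without forming a matrix inverse.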
Original language | English (US) |
---|---|
Title of host publication | 2016 19th IEEE Statistical Signal Processing Workshop, SSP 2016 |
Publisher | IEEE Computer Society |
ISBN (Electronic) | 9781467378024 |
DOIs | |
State | Published - Aug 24 2016 |
Event | 19th IEEE Statistical Signal Processing Workshop, SSP 2016 - Palma de Mallorca, Spain Duration: Jun 25 2016 → Jun 29 2016 |
Publication series
Name | IEEE Workshop on Statistical Signal Processing Proceedings |
---|---|
Volume | 2016-August |
Other
Other | 19th IEEE Statistical Signal Processing Workshop, SSP 2016 |
---|---|
Country/Territory | Spain |
City | Palma de Mallorca |
Period | 6/25/16 → 6/29/16 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
Keywords
- Bregman divergence
- Kernel ridge regression
- Nesterov's accelerated gradient method
- multi-kernel learning