Fast convergent algorithms for multi-kernel regression

Liang Zhang, Daniel Romero, Georgios B. Giannakis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Kernel ridge regression plays a central role in various signal processing and machine learning applications. Suitable kernels are often chosen as linear combinations of 'basis kernels' by optimizing criteria under regularization constraints. Although such approaches offer reliable generalization performance, solving the associated min-max optimization problems poses major challenges, especially with big data inputs. After analyzing the key properties of a convex reformulation, the present paper introduces an efficient algorithm based on a generalization of Nesterov's acceleration method, which achieves an order-optimal convergence rate among first-order methods. Closed-form updates are derived for common regularizers. Experiments on real datasets corroborate considerable speedup advantages over competing algorithms.
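For readers new to the setup, the following is a minimal, self-contained sketch of multi-kernel ridge regression with a simplex-constrained combination of basis kernels. It illustrates the problem class the paper addresses, not the authors' algorithm: it uses plain projected gradient descent on the kernel weights in place of the generalized Nesterov acceleration developed in the paper, and the RBF basis kernels, step size, and function names (rbf_kernel, mkl_ridge, simplex_project) are hypothetical choices made for this example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) basis kernel: K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def simplex_project(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u - (css - 1.0) / idx > 0)[0][-1]
    tau = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - tau, 0.0)

def mkl_ridge(X, y, gammas, lam=1.0, n_iter=200, eta=0.01):
    """Multi-kernel ridge regression with K(theta) = sum_m theta_m * K_m,
    theta constrained to the simplex. Inner step: closed-form KRR solve;
    outer step: projected gradient descent on the kernel weights."""
    n = X.shape[0]
    Ks = [rbf_kernel(X, X, g) for g in gammas]           # basis kernel matrices
    theta = np.full(len(Ks), 1.0 / len(Ks))              # uniform initialization
    for _ in range(n_iter):
        K = sum(t * Km for t, Km in zip(theta, Ks))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)  # closed-form KRR solve
        # gradient of lam * y^T (K(theta) + lam I)^{-1} y w.r.t. each theta_m
        grad = np.array([-lam * alpha @ Km @ alpha for Km in Ks])
        theta = simplex_project(theta - eta * grad)      # stay on the simplex
    return theta, alpha, Ks

# Toy usage on synthetic data: learn weights over three RBF bandwidths.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
theta, alpha, Ks = mkl_ridge(X, y, gammas=[0.1, 1.0, 10.0], lam=0.1)
print("learned kernel weights:", theta)
```

Per the abstract, replacing the outer projected-gradient loop with a generalized Nesterov-type accelerated update over the weights is what yields the paper's order-optimal convergence rate.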

Original language: English (US)
Title of host publication: 2016 19th IEEE Statistical Signal Processing Workshop, SSP 2016
Publisher: IEEE Computer Society
ISBN (Electronic): 9781467378024
DOIs
State: Published - Aug 24, 2016
Event: 19th IEEE Statistical Signal Processing Workshop, SSP 2016 - Palma de Mallorca, Spain
Duration: Jun 25, 2016 - Jun 29, 2016

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings
Volume: 2016-August

Other

Other: 19th IEEE Statistical Signal Processing Workshop, SSP 2016
Country/Territory: Spain
City: Palma de Mallorca
Period: 6/25/16 - 6/29/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Bregman divergence
  • Kernel ridge regression
  • Nesterov's accelerated gradient method
  • multi-kernel learning
