Proximal gradient method for huberized support vector machine

Yangyang Xu, Ioannis Akrotirianakis, Amit Chakraborty

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

The support vector machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes the problem difficult to solve, in particular for regularized SVMs such as those with ℓ1-regularization. This paper considers the Huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the proximal gradient (PG) method to solve the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result on the support of the solution, based on which we further accelerate the algorithm by a two-stage method. We present extensive numerical experiments on both synthetic and real datasets, which demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
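The idea the abstract describes can be sketched in a few lines: replace the non-differentiable hinge loss with a huberized (quadratically smoothed) version, take a gradient step on the smooth part of the objective (loss plus ℓ2 term of the elastic net), and apply the proximal operator of the ℓ1 term (soft-thresholding). The following is a minimal illustrative sketch, not the paper's implementation; the function names, the smoothing parameter `delta`, the step-size bound, and the regularization weights are all assumptions chosen for the example.

```python
import numpy as np

def huber_hinge(t, delta=0.5):
    # Huberized hinge: a differentiable approximation of max(0, 1 - t)
    # (quadratic on the interval [1 - delta, 1], linear below it)
    return np.where(t > 1, 0.0,
           np.where(t >= 1 - delta, (1 - t) ** 2 / (2 * delta),
                    1 - t - delta / 2))

def huber_hinge_grad(t, delta=0.5):
    # Derivative of the huberized hinge with respect to t
    return np.where(t > 1, 0.0,
           np.where(t >= 1 - delta, -(1 - t) / delta, -1.0))

def soft_threshold(w, tau):
    # Proximal operator of tau * ||w||_1
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def pg_bhsvm(X, y, lam1=0.1, lam2=0.1, delta=0.5, step=None, iters=500):
    """Proximal gradient sketch for a binary huberized SVM with an
    elastic-net penalty: (1/n) sum huber_hinge(y_i (x_i.w + b))
                         + (lam2/2)||w||^2 + lam1 ||w||_1."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    if step is None:
        # Rough Lipschitz bound on the smooth part's gradient (an assumption)
        L = np.linalg.norm(np.hstack([X, np.ones((n, 1))]), 2) ** 2 / (n * delta) + lam2
        step = 1.0 / L
    for _ in range(iters):
        t = y * (X @ w + b)
        g = huber_hinge_grad(t, delta) * y / n
        grad_w = X.T @ g + lam2 * w          # gradient of the smooth part
        grad_b = g.sum()
        # Gradient step on the smooth part, then prox of the l1 term
        w = soft_threshold(w - step * grad_w, step * lam1)
        b = b - step * grad_b
    return w, b
```

Because the huberized loss is differentiable with a Lipschitz-continuous gradient, a fixed step size of `1/L` suffices for convergence here; the two-stage acceleration and linear-rate analysis from the paper are not reproduced in this sketch.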

Original language: English (US)
Pages (from-to): 989-1005
Number of pages: 17
Journal: Pattern Analysis and Applications
Volume: 19
Issue number: 4
State: Published - Nov 1 2016
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2015, Springer-Verlag London.

Keywords

  • Binary and multiclass classification problems
  • Elastic net
  • Huberized hinge loss
  • Linear convergence
  • Proximal gradient method
  • Support vector machine
