Abstract
The support vector machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes regularized SVMs, such as those with ℓ1-regularization, difficult to solve. This paper considers the Huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the proximal gradient (PG) method for solving the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result on the support of the solution, based on which we further accelerate the algorithm via a two-stage method. We present extensive numerical experiments on both synthetic and real datasets that demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
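The two ingredients named in the abstract can be sketched briefly: a Huberized (smoothed) hinge loss and the proximal operator of an elastic-net penalty, which is what a PG iteration applies after each gradient step. This is a minimal illustration assuming one common parameterization of the Huberized hinge loss; the paper's exact definition and the names `delta`, `lam1`, `lam2` may differ.

```python
import numpy as np

def huberized_hinge(t, delta=2.0):
    """Smoothed (Huberized) hinge loss of the margin t = y * w^T x.

    One common piecewise form (the paper's parameterization may differ):
      0                        if t > 1
      (1 - t)^2 / (2 * delta)  if 1 - delta < t <= 1
      1 - t - delta / 2        if t <= 1 - delta
    The pieces meet with matching values and slopes, so the loss is
    differentiable everywhere, unlike the plain hinge loss max(0, 1 - t).
    """
    t = np.asarray(t, dtype=float)
    return np.where(
        t > 1.0, 0.0,
        np.where(t > 1.0 - delta,
                 (1.0 - t) ** 2 / (2.0 * delta),
                 1.0 - t - delta / 2.0))

def prox_elastic_net(w, step, lam1, lam2):
    """Proximal operator of step * (lam1 * ||w||_1 + (lam2 / 2) * ||w||_2^2).

    Closed form: soft-thresholding for the l1 term, then scalar shrinkage
    for the l2 term. In a PG iteration this is applied to the point
    w - step * grad(loss)(w).
    """
    soft = np.sign(w) * np.maximum(np.abs(w) - step * lam1, 0.0)
    return soft / (1.0 + step * lam2)
```

Because the prox has this cheap closed form, each PG iteration costs one gradient evaluation of the smooth loss plus an elementwise thresholding, and the soft-thresholding step is what produces exact zeros in the solution's support.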
Original language | English (US) |
---|---|
Pages (from-to) | 989-1005 |
Number of pages | 17 |
Journal | Pattern Analysis and Applications |
Volume | 19 |
Issue number | 4 |
DOIs | |
State | Published - Nov 1 2016 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2015, Springer-Verlag London.
Keywords
- Binary and multiclass classification problems
- Elastic net
- Huberized hinge loss
- Linear convergence
- Proximal gradient method
- Support vector machine