We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss with an adaptively weighted 1-norm penalty, in which the weights are computed from a 2-norm SVM fit; hence the new algorithm is called the hybrid SVM. Simulation and real-data examples show that the hybrid SVM often improves upon the 1-norm SVM in classification accuracy and also enjoys better feature selection performance.
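The two-stage procedure described above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it uses scikit-learn's `LinearSVC` (which solves the weighted 1-norm problem via the standard column-rescaling trick used for adaptive lasso-type penalties), and the dataset, regularization constants, and the `1e-8` stabilizer are hypothetical choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy data: 2 informative features among 10 (hypothetical example).
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=2, n_redundant=0, random_state=0)

# Step 1: fit a standard 2-norm SVM to obtain initial coefficients.
init = LinearSVC(penalty="l2", loss="hinge", dual=True,
                 C=1.0, max_iter=10000).fit(X, y)
beta0 = init.coef_.ravel()

# Step 2: adaptive weights w_j = 1 / |beta0_j|; a small epsilon
# guards against division by zero for coefficients near zero.
w = 1.0 / (np.abs(beta0) + 1e-8)

# Step 3: solve the weighted 1-norm SVM by rescaling each column
# x_j -> x_j / w_j, fitting a plain 1-norm SVM, and mapping the
# coefficients back (beta_j = beta'_j / w_j).
l1 = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
               C=0.1, max_iter=10000).fit(X / w, y)
beta = l1.coef_.ravel() / w

# Features with nonzero coefficients are the selected ones.
selected = np.flatnonzero(np.abs(beta) > 1e-6)
print("selected features:", selected)
```

Features whose 2-norm SVM coefficients are small receive large weights and are thus penalized more heavily in the second stage, which is what drives them to exactly zero and yields the feature selection behavior.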
Original language: English (US)
Number of pages: 7
Journal: Journal of Machine Learning Research
State: Published - Dec 1 2007
Event: 11th International Conference on Artificial Intelligence and Statistics, AISTATS 2007 - San Juan, Puerto Rico
Duration: Mar 21 2007 - Mar 24 2007