TY - GEN

T1 - Online sparse kernel-based classification by projections

AU - Slavakis, Konstantinos

AU - Theodoridis, Sergios

AU - Yamada, Isao

PY - 2007/12/1

Y1 - 2007/12/1

N2 - This paper presents a novel sparse approximation method for online classification in Reproducing Kernel Hilbert Spaces (RKHS) by exploiting adaptive projection-based algorithms. We use convex analysis to revisit the standard kernel-based classification task as the problem of finding a point that belongs to a closed halfspace (a special closed convex set) in an RKHS. In this way, classification in an online setting, where data arrive sequentially, is treated as the task of finding a point in the nonempty intersection of an infinite sequence of closed halfspaces in an RKHS. Convex analysis is also used to introduce sparsification arguments into the design by imposing a simple convex constraint on the norm of the classifier. An algorithmic solution to this optimization problem, where new convex constraints are added at every time instant, is given by the recently introduced Adaptive Projected Subgradient Method (APSM), which unifies a number of well-known adaptive projection-based algorithms such as the classical Normalized Least Mean Squares (NLMS) and the Affine Projection Algorithm (APA). Several theoretical results are established for the generated sequence of classifiers in the RKHS: monotone approximation, strong convergence, asymptotic optimality, and characterization of the limit classifier. Further, we show that the additional convex constraint on the norm of the classifier naturally leads to an online sparse approximation of the resulting kernel series expansion. The proposed method is validated on the adaptive equalization of a nonlinear communication channel.

AB - This paper presents a novel sparse approximation method for online classification in Reproducing Kernel Hilbert Spaces (RKHS) by exploiting adaptive projection-based algorithms. We use convex analysis to revisit the standard kernel-based classification task as the problem of finding a point that belongs to a closed halfspace (a special closed convex set) in an RKHS. In this way, classification in an online setting, where data arrive sequentially, is treated as the task of finding a point in the nonempty intersection of an infinite sequence of closed halfspaces in an RKHS. Convex analysis is also used to introduce sparsification arguments into the design by imposing a simple convex constraint on the norm of the classifier. An algorithmic solution to this optimization problem, where new convex constraints are added at every time instant, is given by the recently introduced Adaptive Projected Subgradient Method (APSM), which unifies a number of well-known adaptive projection-based algorithms such as the classical Normalized Least Mean Squares (NLMS) and the Affine Projection Algorithm (APA). Several theoretical results are established for the generated sequence of classifiers in the RKHS: monotone approximation, strong convergence, asymptotic optimality, and characterization of the limit classifier. Further, we show that the additional convex constraint on the norm of the classifier naturally leads to an online sparse approximation of the resulting kernel series expansion. The proposed method is validated on the adaptive equalization of a nonlinear communication channel.

UR - http://www.scopus.com/inward/record.url?scp=45749130079&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=45749130079&partnerID=8YFLogxK

U2 - 10.1109/MLSP.2007.4414322

DO - 10.1109/MLSP.2007.4414322

M3 - Conference contribution

AN - SCOPUS:45749130079

SN - 1424415667

SN - 9781424415663

T3 - Machine Learning for Signal Processing 17 - Proceedings of the 2007 IEEE Signal Processing Society Workshop, MLSP

SP - 294

EP - 299

BT - Machine Learning for Signal Processing 17 - Proceedings of the 2007 IEEE Signal Processing Society Workshop, MLSP

T2 - 17th IEEE International Workshop on Machine Learning for Signal Processing, MLSP-2007

Y2 - 27 August 2007 through 29 August 2007

ER -