This paper presents a novel sparse approximation method for online classification in Reproducing Kernel Hilbert Spaces (RKHS) by exploiting adaptive projection-based algorithms. Using convex analysis, we recast the standard kernel-based classification task as the problem of finding a point that belongs to a closed halfspace (a special closed convex set) in an RKHS. In this way, classification in an online setting, where data arrive sequentially, is treated as the task of finding a point in the nonempty intersection of an infinite sequence of closed halfspaces in an RKHS. Convex analysis is also used to introduce sparsification into the design by imposing a simple convex constraint on the norm of the classifier. An algorithmic solution to this optimization problem, in which new convex constraints are added at each time instant, is given by the recently introduced Adaptive Projected Subgradient Method (APSM), which unifies a number of well-known adaptive projection-based algorithms such as the classical Normalized Least Mean Squares (NLMS) algorithm and the Affine Projection Algorithm (APA). Several theoretical results are established for the generated sequence of classifiers in the RKHS: monotone approximation, strong convergence, asymptotic optimality, and a characterization of the limit classifier. Furthermore, we show that the additional convex constraint on the norm of the classifier naturally leads to an online sparse approximation of the resulting kernel series expansion. The proposed method is validated on the adaptive equalization of a nonlinear communication channel.
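The abstract's core mechanism can be illustrated with a minimal sketch. The following Python code is an assumption-laden toy, not the paper's full APSM (which concurrently combines projections onto several halfspaces per time instant): at each step it performs a single metric projection onto the violated halfspace {f : y·f(x) ≥ ρ} in the RKHS, then projects onto the norm ball {f : ‖f‖ ≤ C}, the convex norm constraint that drives sparsification. The class and parameter names (`rho`, `C`, `sigma`, `tol`) and the coefficient-pruning threshold are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # RBF kernel k(a, b); any positive-definite kernel would do.
    return float(np.exp(-np.sum((np.asarray(a) - np.asarray(b)) ** 2)
                        / (2.0 * sigma ** 2)))

class OnlineKernelHalfspaceClassifier:
    """Toy online classifier: one halfspace projection per sample,
    followed by projection onto the norm ball ||f|| <= C."""

    def __init__(self, rho=0.1, C=5.0, sigma=1.0, tol=1e-6):
        self.rho, self.C, self.sigma, self.tol = rho, C, sigma, tol
        # Kernel expansion f = sum_i alpha_i * k(x_i, .)
        self.centers, self.alphas = [], []

    def decision(self, x):
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for c, a in zip(self.centers, self.alphas))

    def _norm(self):
        # ||f||^2 = alpha^T K alpha over the current dictionary.
        if not self.centers:
            return 0.0
        n = len(self.centers)
        K = np.array([[gaussian_kernel(self.centers[i], self.centers[j],
                                       self.sigma)
                       for j in range(n)] for i in range(n)])
        a = np.array(self.alphas)
        return float(np.sqrt(max(a @ K @ a, 0.0)))

    def update(self, x, y):
        # Metric projection onto the halfspace {f : y*f(x) >= rho}:
        # since f(x) = <f, k(x, .)>, the projection is
        # f <- f + ((rho - y*f(x)) / k(x, x)) * y * k(x, .)
        margin = y * self.decision(x)
        if margin < self.rho:
            step = (self.rho - margin) / gaussian_kernel(x, x, self.sigma)
            self.centers.append(np.asarray(x, dtype=float))
            self.alphas.append(step * y)
        # Projection onto the ball ||f|| <= C is radial scaling;
        # pruning negligible coefficients mimics the sparsification
        # effect that the norm constraint induces.
        nrm = self._norm()
        if nrm > self.C:
            scale = self.C / nrm
            self.alphas = [a * scale for a in self.alphas]
        keep = [i for i, a in enumerate(self.alphas) if abs(a) > self.tol]
        self.centers = [self.centers[i] for i in keep]
        self.alphas = [self.alphas[i] for i in keep]
```

For example, feeding a few labeled 1-D samples sequentially (`clf.update(np.array([2.0]), +1)` and `clf.update(np.array([-2.0]), -1)`) drives the sign of `clf.decision` to agree with the labels, while the dictionary of kernel centers stays small.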