Several kernel algorithms have recently been proposed for nonlinear discriminant analysis. However, these methods mainly address the singularity problem in the high-dimensional feature space. Less attention has been paid to the properties of the resulting discriminant vectors and feature vectors in the reduced-dimensional space. In this paper, we present a new formulation for kernel discriminant analysis. The proposed formulation includes, as special cases, kernel uncorrelated discriminant analysis (KUDA) and kernel orthogonal discriminant analysis (KODA). The feature vectors of KUDA are uncorrelated, while the discriminant vectors of KODA are orthogonal to each other in the feature space. We present theoretical derivations of the proposed KUDA and KODA algorithms. The experimental results show that both KUDA and KODA are very competitive with other nonlinear discriminant algorithms in terms of classification accuracy.
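The distinction the abstract draws between uncorrelated feature vectors (KUDA) and orthogonal discriminant vectors (KODA) can be illustrated with a linear analogue. The sketch below is not the paper's kernel formulation: it uses synthetic toy data and plain scatter matrices in input space, solving the generalized eigenproblem Sb w = λ St w. Eigenvectors of that pencil are St-orthogonal, so the projected features are uncorrelated (the UDA property); applying Gram-Schmidt to the same directions instead makes the discriminant vectors themselves orthonormal (the ODA property).

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 3-class toy data in 4 dimensions (for illustration only)
means = np.array([[0., 0., 0., 0.], [2., 0., 1., 0.], [0., 2., 0., 1.]])
X = np.vstack([rng.normal(m, 0.5, size=(30, 4)) for m in means])
y = np.repeat([0, 1, 2], 30)

mu = X.mean(axis=0)
Xc = X - mu
St = Xc.T @ Xc                        # total scatter matrix
Sb = np.zeros((4, 4))                 # between-class scatter matrix
for c in range(3):
    d = X[y == c].mean(axis=0) - mu
    Sb += (y == c).sum() * np.outer(d, d)

# Solve Sb w = lambda * St w by whitening St first.
evals_t, V = np.linalg.eigh(St)
St_inv_sqrt = V @ np.diag(evals_t ** -0.5) @ V.T
_, U = np.linalg.eigh(St_inv_sqrt @ Sb @ St_inv_sqrt)
W = St_inv_sqrt @ U[:, ::-1][:, :2]   # top-2 discriminant vectors

# Uncorrelated property (UDA analogue): W^T St W = I, so the
# projected features have diagonal (here, identity) covariance.
Z = Xc @ W
print(np.round(Z.T @ Z, 6))

# Orthogonal property (ODA analogue): Gram-Schmidt (QR) makes the
# discriminant directions themselves orthonormal, at the cost of
# the projected features no longer being uncorrelated in general.
Q, _ = np.linalg.qr(W)
print(np.round(Q.T @ Q, 6))
```

In the kernel versions described by the paper, the same two constraints are imposed on coefficient vectors in the kernel-induced feature space rather than on input-space directions; this linear sketch only shows why the two constraints are genuinely different.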
Original language: English (US)
Title of host publication: Proceedings - 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2006
Number of pages: 7
State: Published - 2006
Event: 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2006 - New York, NY, United States
Duration: Jun 17 2006 → Jun 22 2006
Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition