Issue No. 06 - June 2009 (vol. 31)
ISSN: 0162-8828
pp: 1017-1032
Bernhard Haasdonk, University of Stuttgart, Stuttgart
Elżbieta Pękalska, University of Manchester, Manchester
Kernel methods are a class of well-established and successful algorithms for pattern analysis, owing to their mathematical elegance and strong empirical performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed based on the so-called kernel trick. The objective of this paper is twofold. First, we derive a kernel tool that has been missing so far, namely the kernel quadratic discriminant (KQD). We discuss different formulations of KQD based on the regularized kernel Mahalanobis distance in both complete and class-related subspaces. Second, we propose suitable extensions of kernel linear and quadratic discriminants to indefinite kernels. We provide classifiers that are applicable to kernels defined by any symmetric similarity measure. This is important in practice because problem-suited proximity measures often violate the requirement of positive definiteness. As in the traditional case, KQD can be advantageous for data with unequal class spreads in the kernel-induced spaces, which cannot be well separated by a linear discriminant. We illustrate this on artificial and real data for both positive definite and indefinite kernels.
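The central construction named in the abstract, a kernel quadratic discriminant built on a regularized kernel Mahalanobis distance, can be illustrated with a short sketch. The Python code below is an assumption-laden illustration, not the authors' implementation: it assumes a positive definite RBF kernel, regularizes the feature-space class covariance as Sigma_c + sigma*I, and evaluates the resulting Mahalanobis distance via the Woodbury identity; the function names (rbf_kernel, kernel_mahalanobis_sq, kqd_predict) and the parameters gamma and sigma are illustrative choices. The paper's formulations in class-related subspaces, the log-determinant term of full QDA, and the indefinite-kernel extension are all omitted here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and the rows of Y."""
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_mahalanobis_sq(K_cc, k_cx, k_xx, sigma=1e-2):
    """Squared regularized kernel Mahalanobis distance of test points to one class.

    Via the Woodbury identity applied to (Sigma_c + sigma*I):
        d^2(x) = (1/sigma) * (||psi||^2 - kt^T (n*sigma*I + Kt)^{-1} kt),
    where psi is the centered feature map of x, Kt the centered class kernel
    matrix, and kt the centered kernel vector between class samples and x.

    K_cc : (n, n) kernel matrix of the class training samples
    k_cx : (n, m) kernel values between class samples and test points
    k_xx : (m,)  kernel self-similarities k(x, x) of the test points
    """
    n = K_cc.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)                # centering matrix
    Kt = H @ K_cc @ H                                       # centered class kernel matrix
    kt = H @ (k_cx - K_cc.mean(axis=1, keepdims=True))      # centered test kernel columns
    psi_sq = k_xx - 2.0 * k_cx.mean(axis=0) + K_cc.mean()   # ||phi(x) - mu_c||^2
    quad = (kt * np.linalg.solve(n * sigma * np.eye(n) + Kt, kt)).sum(axis=0)
    return (psi_sq - quad) / sigma

def kqd_predict(X_train, y_train, X_test, gamma=1.0, sigma=1e-2):
    """Assign each test point to the class with the smallest kernel Mahalanobis distance."""
    y_train = np.asarray(y_train)
    classes = np.unique(y_train)
    k_xx = np.ones(len(X_test))                             # k(x, x) = 1 for the RBF kernel
    d2 = np.stack([
        kernel_mahalanobis_sq(
            rbf_kernel(X_train[y_train == c], X_train[y_train == c], gamma),
            rbf_kernel(X_train[y_train == c], X_test, gamma),
            k_xx, sigma)
        for c in classes])
    return classes[np.argmin(d2, axis=0)]
```

Because each class gets its own feature-space covariance, the induced decision boundary is quadratic in the kernel-induced space, which is precisely what makes KQD useful when class spreads are unequal and a kernel linear discriminant falls short.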
machine learning, pattern recognition, kernel methods, indefinite kernels, quadratic discriminant
Bernhard Haasdonk, Elżbieta Pękalska, "Kernel Discriminant Analysis for Positive Definite and Indefinite Kernels", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 31, no. 6, pp. 1017-1032, June 2009, doi:10.1109/TPAMI.2008.290