Issue No. 06 - June (2004 vol. 26)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2004.13
M. Loog and Robert P.W. Duin, IEEE
<p><b>Abstract</b>—We propose an eigenvector-based <it>heteroscedastic</it> linear dimension reduction (LDR) technique for multiclass data. The technique is based on a heteroscedastic two-class technique which utilizes the so-called <it>Chernoff criterion</it>, and successfully extends the well-known <it>linear discriminant analysis</it> (LDA). The latter, which is based on the <it>Fisher criterion</it>, is incapable of dealing with heteroscedastic data in a proper way. For the two-class case, the between-class scatter is generalized so as to capture differences in (co)variances. It is shown that the classical notion of between-class scatter can be associated with Euclidean distances between class means. From this viewpoint, the between-class scatter is generalized by employing the Chernoff distance measure, leading to our proposed heteroscedastic measure. Finally, using the results from the two-class case, a multiclass extension of the Chernoff criterion is proposed. This criterion combines separation information present in the class means as well as the class covariance matrices. Extensive experiments and a comparison with similar dimension reduction techniques are presented.</p>
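To illustrate the Chernoff distance measure that the abstract invokes to generalize between-class scatter, the following is a minimal sketch of the standard Chernoff distance between two Gaussian class-conditional densities. This is the textbook distance only, not the paper's full multiclass eigenvector criterion; the function name and the choice of NumPy are assumptions for illustration. Note that with equal covariance matrices the covariance term vanishes and only a Mahalanobis-style term between the class means remains, which is the sense in which the classical (homoscedastic) between-class scatter is a special case.

```python
import numpy as np

def chernoff_distance(m1, S1, m2, S2, alpha=0.5):
    """Chernoff distance between Gaussians N(m1, S1) and N(m2, S2).

    alpha in (0, 1) weights the two covariances; alpha = 0.5 gives the
    Bhattacharyya distance as a special case. (Illustrative helper, not
    the paper's multiclass criterion.)
    """
    Sm = alpha * S1 + (1.0 - alpha) * S2   # mixed covariance
    d = m1 - m2
    # Mean-separation term: a weighted Mahalanobis distance.
    term_means = 0.5 * alpha * (1.0 - alpha) * d @ np.linalg.solve(Sm, d)
    # Covariance-separation term: zero when S1 == S2 (homoscedastic case).
    term_covs = 0.5 * (np.linalg.slogdet(Sm)[1]
                       - alpha * np.linalg.slogdet(S1)[1]
                       - (1.0 - alpha) * np.linalg.slogdet(S2)[1])
    return term_means + term_covs
```

When the two covariances are equal, the distance reduces to a scaled squared Mahalanobis distance between the means, which is why a criterion built on it can capture covariance differences that the Fisher criterion ignores.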
Linear dimension reduction, linear discriminant analysis, Fisher criterion, Chernoff distance, Chernoff criterion.
M. Loog and R. P. W. Duin, "Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 26, no. 6, pp. 732-739, 2004.