Issue No. 08 - August (2004 vol. 26)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2004.37
Ravi Janardan, IEEE
<p><b>Abstract</b>—An optimization criterion is presented for discriminant analysis. The criterion extends the optimization criterion of classical Linear Discriminant Analysis (LDA) through the use of the pseudoinverse when the scatter matrices are singular. It is applicable regardless of the relative sizes of the data dimension and sample size, overcoming a limitation of classical LDA. The optimization problem can be solved analytically by applying the Generalized Singular Value Decomposition (GSVD) technique. The pseudoinverse has been suggested and used for undersampled problems in the past, where the data dimension exceeds the number of data points; the criterion proposed in this paper provides a theoretical justification for this procedure. An approximation algorithm for the GSVD-based approach is also presented. It reduces the computational complexity by finding subclusters of each cluster and using their centroids to capture the structure of each cluster. This reduced problem yields much smaller matrices to which the GSVD can be applied efficiently. Experiments on text data, with up to 7,000 dimensions, show that the approximation algorithm produces results that are close to those produced by the exact algorithm.</p>
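<p>The pseudoinverse-based criterion described above can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the authors' algorithm: it forms the within-class and between-class scatter matrices and replaces the inverse of the (singular) within-class scatter with a pseudoinverse, whereas the paper solves the criterion via the GSVD. The function name <code>lda_pseudoinverse</code> and the toy data are hypothetical.</p>

```python
import numpy as np

def lda_pseudoinverse(X, labels, k):
    """Discriminant directions via the pseudoinverse of the within-class
    scatter matrix. A sketch only: the paper's exact method applies the
    GSVD to the scatter matrices rather than forming a pseudoinverse."""
    d = X.shape[1]
    mu = X.mean(axis=0)               # global centroid
    Sw = np.zeros((d, d))             # within-class scatter
    Sb = np.zeros((d, d))             # between-class scatter
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)          # class centroid
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # When the dimension exceeds the sample size, Sw is singular and
    # classical LDA (inv(Sw) @ Sb) is undefined; the pseudoinverse
    # keeps the criterion well defined, as the paper justifies.
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    return evecs[:, order[:k]].real   # top-k discriminant directions

# Undersampled toy example: 20 dimensions, only 9 samples, 3 classes.
rng = np.random.default_rng(0)
X = rng.standard_normal((9, 20))
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
G = lda_pseudoinverse(X, y, k=2)
print(G.shape)  # (20, 2)
```

<p>Projecting the data as <code>X @ G</code> then reduces the 20-dimensional samples to 2 discriminant dimensions, the same kind of dimension reduction evaluated on text data in the paper.</p>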
Index Terms—Classification, clustering, dimension reduction, generalized singular value decomposition, linear discriminant analysis, text mining.
H. Park, R. Janardan, J. Ye, and C. H. Park, "An Optimization Criterion for Generalized Discriminant Analysis on Undersampled Problems," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 26, no. 8, pp. 982-994, Aug. 2004.