Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria
July 2001 (vol. 23 no. 7)
pp. 762-766

Abstract—We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). LDA can be shown to weight the contributions of the individual class pairs according to the Euclidean distance between the respective class means. We generalize LDA by introducing alternative weighting functions.
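The construction described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the function name `weighted_pairwise_fisher` and its interface are hypothetical, and the default `weight` of 1 is chosen because, in the standard pairwise decomposition of the between-class scatter, a constant weight recovers classical LDA. Pairwise distances are measured in the space whitened by the within-class scatter, where Euclidean distance between means coincides with Mahalanobis distance in the original space.

```python
import numpy as np

def weighted_pairwise_fisher(X, y, d, weight=lambda delta: 1.0):
    """Linear dimension reduction by a weighted pairwise Fisher criterion.

    Builds the between-class scatter as a weighted sum over class pairs;
    with weight(delta) == 1 this reduces to the classical between-class
    scatter, i.e. standard multiclass LDA. (Illustrative sketch only.)
    Returns a (n_features, d) projection matrix A; project with X @ A.
    """
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    means = np.array([X[y == c].mean(axis=0) for c in classes])

    # Pooled (prior-weighted) within-class scatter matrix.
    Sw = sum(p * np.cov(X[y == c].T, bias=True)
             for p, c in zip(priors, classes))

    # Whiten with respect to Sw: W maps x -> W.T @ x so that the
    # within-class scatter becomes the identity.
    evals, evecs = np.linalg.eigh(Sw)
    W = evecs / np.sqrt(evals)
    means_w = means @ W  # class means in the whitened space

    # Weighted pairwise between-class scatter in the whitened space.
    k, dim = means.shape
    Sb = np.zeros((dim, dim))
    for i in range(k):
        for j in range(i + 1, k):
            diff = means_w[i] - means_w[j]
            delta = np.linalg.norm(diff)  # Mahalanobis distance of the pair
            Sb += priors[i] * priors[j] * weight(delta) * np.outer(diff, diff)

    # Top-d eigenvectors of Sb (eigh returns ascending order), mapped back.
    _, vecs = np.linalg.eigh(Sb)
    A = W @ vecs[:, ::-1][:, :d]
    return A
```

Passing a `weight` that decays with `delta` downweights class pairs that are already far apart, which is the direction the paper takes; for instance, an erf-based weight such as `lambda d: math.erf(d / (2 * math.sqrt(2))) / (2 * d * d)` is of the kind discussed in this literature, though the abstract itself does not specify the weighting function.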

Index Terms:
Linear dimension reduction, Fisher criterion, linear discriminant analysis, Bayes error, approximate pairwise accuracy criterion.
Citation:
Marco Loog, R.P.W. Duin, R. Haeb-Umbach, "Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 7, pp. 762-766, July 2001, doi:10.1109/34.935849