An Optimal Set of Discriminant Vectors
March 1975 (vol. 24, no. 3), pp. 281-289
D.H. Foley, Pattern Analysis and Recognition Corporation
J.W. Sammon
Abstract:
A new method for the extraction of features in a two-class pattern recognition problem is derived. The main advantage is that the method for selecting features is based entirely upon discrimination or separability as opposed to the more common approach of fitting. The classical example of fitting is the use of the eigenvectors of the lumped covariance matrix corresponding to the largest eigenvalues. In an analogous manner, the new technique selects discriminant vectors (or features) corresponding to the largest "discrim-values." The new method is compared to some of the more popular alternative techniques via both data-dependent and mathematical examples. In addition, a recursive method for obtaining the discriminant vectors is given.
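The abstract itself does not reproduce the recursion, but the idea it describes can be illustrated concretely. The sketch below is a minimal, hypothetical Python implementation of the general scheme for a two-class problem: the first discriminant vector maximizes the Fisher criterion ("discrim-value") R(d) = (d'Δ)² / (d'Wd), where Δ is the difference of class means and W the pooled within-class scatter, and each subsequent vector maximizes the same criterion subject to orthogonality with the vectors already found. Note that the paper derives a closed-form recursive formula for this; the sketch instead solves each constrained maximization directly by restricting the search to the orthogonal complement of the earlier vectors, which yields the same directions. The function name and the use of sample covariances for W are assumptions for illustration, not taken from the paper.

import numpy as np

def foley_sammon_vectors(X1, X2, n_vectors):
    """Hypothetical sketch: orthogonal discriminant vectors for a
    two-class problem, each maximizing the Fisher criterion
    R(d) = (d' delta)^2 / (d' W d) subject to orthogonality with
    the vectors already chosen. Rows of X1, X2 are samples."""
    delta = X1.mean(axis=0) - X2.mean(axis=0)  # mean-difference vector
    # Pooled within-class scatter (assumed nonsingular); the overall
    # scale of W does not affect the maximizing directions.
    W = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    dim = X1.shape[1]
    vectors = []
    for _ in range(min(n_vectors, dim)):
        if vectors:
            D = np.column_stack(vectors)
            # Orthonormal basis P of the complement of span(D):
            # restrict the search to directions orthogonal to all
            # earlier discriminant vectors.
            Q, _ = np.linalg.qr(D, mode="complete")
            P = Q[:, D.shape[1]:]
        else:
            P = np.eye(dim)
        # Within the restricted subspace, the Fisher criterion is
        # maximized by (P'WP)^{-1} P'delta, mapped back through P.
        z = np.linalg.solve(P.T @ W @ P, P.T @ delta)
        d = P @ z
        d /= np.linalg.norm(d)
        vectors.append(d)
    return np.column_stack(vectors)

# Example on synthetic data: two Gaussian clouds in 5 dimensions.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=0.0, size=(200, 5))
X2 = rng.normal(loc=1.0, size=(200, 5))
D = foley_sammon_vectors(X1, X2, n_vectors=3)
# Columns of D are mutually orthogonal discriminant directions with
# non-increasing discrim-values; the first column is Fisher's
# classical linear discriminant direction.

Because each successive maximization runs over a strictly smaller subspace, the discrim-values are non-increasing, which is what makes them usable for ranking features, in direct analogy to ranking eigenvectors by eigenvalue in the Karhunen-Loève approach the abstract contrasts against.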
Index Terms:
Dimensionality reduction, discriminants, eigenvectors, feature extraction, feature ranking, feature selection, Karhunen-Loève expansions, multivariate data analysis, pattern classification, pattern recognition.
Citation:
D.H. Foley, J.W. Sammon, "An Optimal Set of Discriminant Vectors," IEEE Transactions on Computers, vol. 24, no. 3, pp. 281-289, March 1975, doi:10.1109/T-C.1975.224208