Issue No. 3 - March (1975, vol. 24)
pp. 281-289
An Optimal Set of Discriminant Vectors
D.H. Foley, Pattern Analysis and Recognition Corporation
ABSTRACT
A new method for the extraction of features in a two-class pattern recognition problem is derived. The main advantage is that the method for selecting features is based entirely upon discrimination or separability as opposed to the more common approach of fitting. The classical example of fitting is the use of the eigenvectors of the lumped covariance matrix corresponding to the largest eigenvalues. In an analogous manner, the new technique selects discriminant vectors (or features) corresponding to the largest "discrim-values." The new method is compared to some of the more popular alternative techniques via both data-dependent and mathematical examples. In addition, a recursive method for obtaining the discriminant vectors is given.
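The abstract's central contrast, "fitting" via the top eigenvectors of the lumped covariance matrix versus "discrimination" via vectors chosen for separability, can be illustrated numerically. The sketch below (an illustrative assumption using NumPy and synthetic Gaussian data, not the paper's own examples) compares the top lumped-covariance eigenvector with the first discriminant vector, the Fisher direction, on a two-class problem where the direction of greatest variance is not the direction of greatest separation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (illustrative assumption): large within-class
# spread along x, but the class means are separated along y.
n = 500
X1 = rng.normal([0.0, 3.0], [5.0, 1.0], size=(n, 2))
X2 = rng.normal([0.0, -3.0], [5.0, 1.0], size=(n, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter (sum of per-class sample covariances).
Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)

# "Fitting": top eigenvector of the lumped (all-data) covariance matrix.
X = np.vstack([X1, X2])
evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
pca_dir = evecs[:, -1]            # eigenvector of the largest eigenvalue

# "Discrimination": the Fisher direction, i.e. the first discriminant vector.
fisher_dir = np.linalg.solve(Sw, m1 - m2)
fisher_dir /= np.linalg.norm(fisher_dir)

def fisher_ratio(w):
    """Between-class separation over within-class spread along w."""
    return float((w @ (m1 - m2)) ** 2 / (w @ Sw @ w))

print(f"lumped-covariance eigenvector ratio: {fisher_ratio(pca_dir):.4f}")
print(f"Fisher discriminant vector ratio:    {fisher_ratio(fisher_dir):.4f}")
```

On this data the top eigenvector aligns with the high-variance x-axis and separates the classes poorly, while the discriminant vector aligns with the y-axis and yields a far larger separability ratio, which is the kind of failure of "fitting" that motivates selecting vectors by their discriminatory power instead.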
INDEX TERMS
Dimensionality reduction, discriminants, eigenvectors, feature extraction, feature ranking, feature selection, Karhunen-Loeve expansions, multivariate data analysis, pattern classification, pattern recognition.
CITATION
D.H. Foley, J.W. Sammon, "An Optimal Set of Discriminant Vectors", IEEE Transactions on Computers, vol. 24, no. 3, pp. 281-289, March 1975, doi:10.1109/T-C.1975.224208