Information Discriminant Analysis: Feature Extraction with an Information-Theoretic Objective
August 2007 (vol. 29 no. 8)
pp. 1394-1407
Abstract:
Using elementary information-theoretic tools, we develop a novel technique for linear transformation from the space of observations into a low-dimensional (feature) subspace for the purpose of classification. The technique is based on the numerical optimization of an information-theoretic objective function that can be computed analytically. The advantages of the proposed method over several other techniques are discussed, and the conditions under which the method reduces to linear discriminant analysis are given. We show that the novel objective function enjoys many of the properties of the mutual information and the Bayes error, and we give sufficient conditions for the method to be Bayes-optimal. Since the objective function is maximized numerically, we show how the calculations can be accelerated to yield feasible solutions. The performance of the method compares favorably with other linear discriminant-based feature extraction methods on a number of simulated and real-world data sets.
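The abstract notes that the proposed method reduces, under certain conditions, to linear discriminant analysis (LDA). As background, the classical Fisher LDA baseline it is compared against can be sketched as follows; this is a minimal NumPy illustration of standard LDA feature extraction, not the paper's information-theoretic method, and the function name `lda_features` is our own choice.

```python
import numpy as np

def lda_features(X, y, d):
    """Project samples X (n x p) with labels y onto the top-d Fisher
    LDA directions, i.e., the leading eigenvectors of Sw^{-1} Sb."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    p = X.shape[1]
    Sw = np.zeros((p, p))  # within-class scatter
    Sb = np.zeros((p, p))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb w = lambda Sw w, solved via Sw^{-1} Sb.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    W = evecs[:, order[:d]].real  # top-d discriminant directions
    return X @ W
```

For a two-class problem, at most one discriminant direction is informative (the rank of the between-class scatter is the number of classes minus one), which is one limitation the paper's information-theoretic objective is designed to overcome.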
Index Terms:
Feature extraction, information theory, mutual information, entropy, classification, linear discriminant analysis, Bayes error.
Citation:
Zoran Nenadic, "Information Discriminant Analysis: Feature Extraction with an Information-Theoretic Objective," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 8, pp. 1394-1407, Aug. 2007, doi:10.1109/TPAMI.2007.1156