Issue No. 08 - August 2007 (Vol. 29)
Zoran Nenadic, IEEE
Using elementary information-theoretic tools, we develop a novel technique for linearly transforming observations into a low-dimensional feature subspace for the purpose of classification. The technique is based on the numerical optimization of an information-theoretic objective function that can be computed analytically. We discuss the advantages of the proposed method over several other techniques and give the conditions under which the method reduces to linear discriminant analysis. We show that the novel objective function enjoys many of the properties of the mutual information and the Bayes error, and we give sufficient conditions for the method to be Bayes-optimal. Since the objective function is maximized numerically, we show how the calculations can be accelerated to yield feasible solutions. The performance of the method compares favorably with that of other linear-discriminant-based feature extraction methods on a number of simulated and real-world data sets.
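The abstract describes a general recipe: numerically optimize a discriminability objective over a linear projection W and classify in the resulting feature subspace. The sketch below is a minimal illustration of that recipe only; it uses the classical LDA trace criterion tr((W'SwW)^{-1} W'SbW) as a stand-in objective, since the abstract states the proposed method reduces to linear discriminant analysis under certain conditions. The paper's actual information-theoretic objective is not reproduced here, and the data, dimensions, and function names are illustrative assumptions.

# Sketch: linear feature extraction by numerical optimization of a
# discriminant objective (LDA trace criterion as a stand-in, not the
# paper's information-theoretic objective).
import numpy as np
from scipy.optimize import minimize

def scatter_matrices(X, y):
    """Within-class (Sw) and between-class (Sb) scatter of data X with labels y."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    return Sw, Sb

def neg_objective(w_flat, Sw, Sb, d, k):
    """Negative trace criterion tr((W'SwW)^{-1} W'SbW); minimized numerically."""
    W = w_flat.reshape(d, k)
    A = W.T @ Sw @ W
    B = W.T @ Sb @ W
    return -np.trace(np.linalg.solve(A, B))

# Toy data: two Gaussian classes in 10 dimensions, reduced to k = 2 features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(1.0, 1.0, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
Sw, Sb = scatter_matrices(X, y)
d, k = X.shape[1], 2
W0 = rng.normal(size=(d, k)).ravel()          # random initial projection
res = minimize(neg_objective, W0, args=(Sw, Sb, d, k), method="BFGS")
W = res.x.reshape(d, k)                       # learned linear feature extractor
Z = X @ W                                     # low-dimensional features for classification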
Feature extraction, information theory, mutual information, entropy, classification, linear discriminant analysis, Bayes error.
Z. Nenadic, "Information Discriminant Analysis: Feature Extraction with an Information-Theoretic Objective," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 29, no. , pp. 1394-1407, 2007.