Issue No. 10, October 2003 (vol. 25)
<p><b>Abstract</b>—This paper focuses on the problems of feature selection and classification when classes are modeled by statistically independent features. We show that, under the assumption of class-conditional independence, the class separability measure of divergence is greatly simplified, becoming a sum of unidimensional divergences and thereby providing a feature selection criterion that requires no exhaustive search. Since the hypothesis of independence is rarely met in practice, we also provide a framework based on class-conditional Independent Component Analyzers in which this assumption can be held on stronger grounds. Divergence and the Bayes decision scheme are adapted to this class-conditional representation. An algorithm that integrates the proposed representation, feature selection technique, and classifier is presented. Experiments on artificial, benchmark, and real-world data illustrate our technique and evaluate its performance.</p>
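The additive criterion described in the abstract can be sketched as follows. This is a minimal illustration assuming Gaussian class-conditional marginals (the paper's full setting uses class-conditional ICA representations, which is not reproduced here): under independence, the divergence of a feature set is the sum of per-feature divergences, so features can be ranked individually instead of searching over subsets.

```python
import numpy as np

def gaussian_divergence_1d(x1, x2):
    """Symmetric KL divergence between two 1-D Gaussians fitted to x1, x2."""
    m1, v1 = x1.mean(), x1.var() + 1e-12
    m2, v2 = x2.mean(), x2.var() + 1e-12
    d = (m1 - m2) ** 2
    return 0.5 * ((v1 + d) / v2 + (v2 + d) / v1) - 1.0

def select_features(X1, X2, k):
    """Rank features by their unidimensional divergence (additive criterion):
    no exhaustive subset search is needed, only one score per feature."""
    scores = np.array([gaussian_divergence_1d(X1[:, j], X2[:, j])
                       for j in range(X1.shape[1])])
    order = np.argsort(scores)[::-1]
    return order[:k], scores

rng = np.random.default_rng(0)
# Synthetic two-class data: feature 0 separates the classes, features 1-2 are noise.
X1 = rng.normal(loc=[3.0, 0.0, 0.0], scale=1.0, size=(500, 3))
X2 = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(500, 3))

top, scores = select_features(X1, X2, k=1)
print(top)  # the discriminative feature (index 0) ranks first
```

Because each score depends on one feature only, the cost is linear in dimensionality, which is the practical advantage the abstract claims over exhaustive subset evaluation.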
Feature selection, divergence, independent component analysis, naive Bayes.
J. Vitrià and M. Bressan, "On the Selection and Classification of Independent Features," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 10, pp. 1312-1317, 2003.