Issue No. 10, October 2003 (vol. 25), pp. 1312-1317
Abstract—This paper addresses the problems of feature selection and classification when classes are modeled by statistically independent features. We show that, under the assumption of class-conditional independence, the divergence measure of class separability simplifies greatly, becoming a sum of unidimensional divergences and yielding a feature selection criterion that requires no exhaustive search. Since the hypothesis of independence is rarely met in practice, we also provide a framework based on class-conditional Independent Component Analyzers, in which this assumption holds on stronger grounds. Divergence and the Bayes decision scheme are adapted to this class-conditional representation. We present an algorithm that integrates the proposed representation, feature selection technique, and classifier. Experiments on artificial, benchmark, and real-world data illustrate the technique and evaluate its performance.
Index Terms—Feature selection, divergence, independent component analysis, naive Bayes.
Marco Bressan, Jordi Vitrià, "On the Selection and Classification of Independent Features", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 10, pp. 1312-1317, October 2003, doi:10.1109/TPAMI.2003.1233904
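The abstract's key observation, that under class-conditional independence the divergence between two classes is a sum of unidimensional divergences, means features can be scored individually and ranked, with no exhaustive subset search. A minimal sketch of this idea follows; it assumes Gaussian class-conditional marginals (an illustrative choice, not specified by the abstract), for which the symmetric Kullback-Leibler divergence has a closed form. The function names are hypothetical.

```python
import numpy as np

def gaussian_sym_divergence(x1, x2):
    """Symmetric KL divergence between univariate Gaussians fitted to
    samples x1 and x2 (closed form; small constant avoids zero variance)."""
    m1, v1 = x1.mean(), x1.var() + 1e-12
    m2, v2 = x2.mean(), x2.var() + 1e-12
    return 0.5 * (v1 / v2 + v2 / v1 - 2) + 0.5 * (m1 - m2) ** 2 * (1 / v1 + 1 / v2)

def select_features(X1, X2, k):
    """Score each feature by its unidimensional divergence between the two
    classes and keep the top k. Under class-conditional independence the
    total divergence is the sum of these per-feature terms, so ranking
    features individually replaces an exhaustive subset search."""
    scores = np.array([gaussian_sym_divergence(X1[:, j], X2[:, j])
                       for j in range(X1.shape[1])])
    ranked = np.argsort(scores)[::-1]  # most separable features first
    return ranked[:k], scores
```

For real data whose features are not independent, the abstract's framework would first map each class through its own Independent Component Analyzer so that the independence assumption holds on stronger grounds before applying this per-feature criterion.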
