Issue No. 2, February 1996 (vol. 18), pp. 218-223
ABSTRACT
A new feature selection procedure is presented, based on the Kullback J-divergence between two class-conditional density functions, each approximated by a finite mixture of parameterized densities of a special type. This procedure is especially suitable for multimodal data. Apart from finding a feature subset of any cardinality without involving any search procedure, it also simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
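The core quantity in the abstract is the Kullback J-divergence, the symmetrized Kullback-Leibler divergence between the two class-conditional densities. As a minimal illustration (not the paper's mixture-based procedure), the sketch below computes the J-divergence between two discrete density estimates and uses it to rank features, which mirrors the idea of ordering features by class separability without a combinatorial search; the function names and the histogram-based density estimates are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions on a shared support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    """Symmetric Kullback J-divergence: J(p, q) = KL(p||q) + KL(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def rank_features_by_j(X0, X1, bins=10):
    """Order features by per-feature J-divergence between two classes.

    X0, X1: samples (n_samples, n_features) from class 0 and class 1.
    A small epsilon keeps the histogram estimates strictly positive.
    This marginal, histogram-based ranking is a simplification of the
    paper's mixture-model approach.
    """
    eps = 1e-9
    scores = []
    for j in range(X0.shape[1]):
        lo = min(X0[:, j].min(), X1[:, j].min())
        hi = max(X0[:, j].max(), X1[:, j].max())
        p, _ = np.histogram(X0[:, j], bins=bins, range=(lo, hi), density=False)
        q, _ = np.histogram(X1[:, j], bins=bins, range=(lo, hi), density=False)
        p = (p + eps) / (p + eps).sum()
        q = (q + eps) / (q + eps).sum()
        scores.append(j_divergence(p, q))
    # Higher J-divergence = better class separation for that feature
    return np.argsort(scores)[::-1], scores
```

Because the ranking is obtained by sorting a per-feature score, a subset of any cardinality k is read off as the top-k features, with no search over subsets.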
INDEX TERMS
Feature selection, feature ordering, mixture distribution, maximum likelihood, EM algorithm, Kullback J-divergence.
CITATION
Jana Novovicová, Pavel Pudil, Josef Kittler, "Divergence Based Feature Selection for Multimodal Class Densities", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 18, no. 2, pp. 218-223, February 1996, doi:10.1109/34.481557
