Issue No. 08 - August (2010 vol. 32)
ISSN: 0162-8828
pp: 1517-1522
Jose M. Peña , Linköping University, Linköping
Roland Nilsson , Harvard Medical School, Boston
ABSTRACT
Consider a classification problem involving only discrete features that are represented as random variables with some prescribed discrete sample space. In this paper, we study the complexity of two feature selection problems. The first problem consists of finding a feature subset of a given size k that has minimal Bayes risk. We show that for any increasing ordering of the Bayes risks of the feature subsets (consistent with an obvious monotonicity constraint), there exists a probability distribution that exhibits that ordering. This implies that solving the first problem requires an exhaustive search over the feature subsets of size k. The second problem consists of finding the minimal feature subset that has minimal Bayes risk. In light of the complexity of the first problem, one may think that solving the second problem requires an exhaustive search over all of the feature subsets. We show that, under mild assumptions, this is not true. We also study the practical implications of our solutions to the second problem.
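The exhaustive search that the first problem requires can be sketched as follows. This is a minimal illustrative example, not code from the paper: the toy joint distribution over three binary features and a binary class label is hypothetical, and `bayes_risk` simply computes R(S) = sum over x_S of p(x_S)(1 - max_y p(y | x_S)) for a candidate feature subset S.

```python
from itertools import combinations, product

# Hypothetical joint distribution p(x1, x2, x3, y) over three binary
# features and a binary class label, chosen only for illustration.
p = {bits: 1 / 16 for bits in product([0, 1], repeat=4)}
p[(0, 0, 0, 0)] += 0.10  # skew the distribution so that different
p[(1, 1, 0, 1)] += 0.10  # feature subsets have different Bayes risks
total = sum(p.values())
p = {k: v / total for k, v in p.items()}

def bayes_risk(subset):
    """Bayes risk when classifying from the features in `subset` only:
    R(S) = sum_{x_S} p(x_S) * (1 - max_y p(y | x_S))."""
    # Marginalize p over the discarded features to get p(x_S, y).
    marg = {}
    for (x1, x2, x3, y), prob in p.items():
        xs = tuple((x1, x2, x3)[i] for i in subset)
        marg[(xs, y)] = marg.get((xs, y), 0.0) + prob
    risk = 0.0
    for xs in {xs for (xs, _) in marg}:
        joint = [marg.get((xs, y), 0.0) for y in (0, 1)]
        risk += sum(joint) - max(joint)  # p(x_S) - max_y p(x_S, y)
    return risk

# First problem: minimal-Bayes-risk subset of size k. Since any risk
# ordering of the subsets can occur, the worst case is this exhaustive
# search over all C(n, k) subsets.
k = 2
best = min(combinations(range(3), k), key=bayes_risk)
print(best, bayes_risk(best))
```

The monotonicity constraint mentioned in the abstract is visible here: adding a feature can never increase the Bayes risk, so the risk of the full feature set lower-bounds that of every subset.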
INDEX TERMS
Feature evaluation and selection, classifier design and evaluation, machine learning.
CITATION
Jose M. Peña, Roland Nilsson, "On the Complexity of Discrete Feature Selection for Optimal Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 8, pp. 1517-1522, August 2010, doi:10.1109/TPAMI.2010.84