IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 8, August 2010

pp. 1517-1522

Jose M. Peña , Linköping University, Linköping

Roland Nilsson , Harvard Medical School, Boston

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2010.84

ABSTRACT

Consider a classification problem involving only discrete features, represented as random variables with some prescribed discrete sample space. In this paper, we study the complexity of two feature selection problems. The first problem consists of finding a feature subset of a given size k with minimal Bayes risk. We show that for any increasing ordering of the Bayes risks of the feature subsets (consistent with an obvious monotonicity constraint), there exists a probability distribution that exhibits that ordering. This implies that solving the first problem requires an exhaustive search over the feature subsets of size k. The second problem consists of finding the minimal feature subset with minimal Bayes risk. In light of the complexity of the first problem, one might think that solving the second problem also requires an exhaustive search over all of the feature subsets. We show that, under mild assumptions, this is not true. We also study the practical implications of our solutions to the second problem.
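To make the first problem concrete, the exhaustive search it entails can be sketched as follows. Assuming the full joint distribution P(X₁, ..., Xₙ, Y) is available as a table, the Bayes risk of a feature subset S is 1 − Σ over the configurations x_S of max_y P(x_S, y), and the search simply enumerates all subsets of size k. The function and variable names below are illustrative, not taken from the paper:

```python
from itertools import combinations

def bayes_risk(joint, subset):
    """Bayes risk of predicting Y from the features indexed by `subset`.

    `joint` maps (x_1, ..., x_n, y) tuples to probabilities. The risk is
    1 - sum over configurations of the subset of max_y P(x_S, y).
    """
    marginal = {}  # P(x_S, y), obtained by summing out the other features
    for assignment, p in joint.items():
        *features, y = assignment
        key = tuple(features[i] for i in subset)
        marginal[key, y] = marginal.get((key, y), 0.0) + p
    # For each configuration of the subset, keep the most probable class.
    best = {}
    for (key, y), p in marginal.items():
        best[key] = max(best.get(key, 0.0), p)
    return 1.0 - sum(best.values())

def min_risk_subset_of_size(joint, n, k):
    """Exhaustive search over all size-k feature subsets (the first problem)."""
    return min(combinations(range(n), k), key=lambda s: bayes_risk(joint, s))
```

For example, with two binary features where Y always equals X₀ and X₁ is pure noise, the subset {X₀} attains risk 0 while {X₁} attains risk 0.5, so the size-1 search returns {X₀}. The paper's result is that, in general, no ordering-based shortcut avoids enumerating all C(n, k) subsets.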

INDEX TERMS

Feature evaluation and selection, classifier design and evaluation, machine learning.

CITATION

Jose M. Peña, Roland Nilsson, "On the Complexity of Discrete Feature Selection for Optimal Classification", *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 32, no. 8, pp. 1517-1522, August 2010, doi:10.1109/TPAMI.2010.84