<p>The authors present a statistical-heuristic feature selection criterion for constructing multibranching decision trees in noisy real-world domains. Real-world problems often involve multivalued features, for which multibranching decision trees provide a more efficient and more comprehensible solution than binary decision trees. The authors propose a statistical-heuristic criterion, the symmetrical tau, and then discuss its consistency with a Bayesian classifier and its built-in statistical test. The combination of a proportional-reduction-in-error measure with a cost-of-complexity heuristic makes the symmetrical tau a powerful criterion with many merits, including robustness to noise, fairness to multivalued features, the ability to handle Boolean combinations of logical features, and a middle-cut preference. The tau criterion also provides a natural basis for prepruning and dynamic error estimation. Illustrative examples are presented.</p>
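<p>As a rough illustration (not the authors' code), the symmetrical tau can be sketched as the symmetrized Goodman-Kruskal tau computed from a feature-by-class contingency table; the exact criterion and its statistical test are detailed in the paper itself.</p>

```python
def symmetrical_tau(table):
    """Sketch of a symmetrized Goodman-Kruskal tau.

    table[i][j] = count of training examples taking the i-th value of the
    candidate feature and belonging to the j-th class. Returns a value in
    [0, 1]: 0 for statistical independence, 1 for perfect association.
    """
    total = sum(sum(row) for row in table)
    P = [[c / total for c in row] for row in table]          # joint probabilities
    row_m = [sum(row) for row in P]                          # feature marginals P(i+)
    col_m = [sum(P[i][j] for i in range(len(P)))             # class marginals P(+j)
             for j in range(len(P[0]))]

    # Sum of squared joint probabilities, normalized by each marginal in turn
    # (symmetrizing the two directional Goodman-Kruskal tau measures).
    num = 0.0
    for i, row in enumerate(P):
        for j, p in enumerate(row):
            if p > 0:
                num += p * p / col_m[j] + p * p / row_m[i]
    num -= sum(m * m for m in row_m) + sum(m * m for m in col_m)
    den = 2.0 - sum(m * m for m in row_m) - sum(m * m for m in col_m)
    return num / den
```

<p>A feature that perfectly predicts the class, e.g. the table [[10, 0], [0, 10]], yields tau = 1, while an uninformative feature such as [[5, 5], [5, 5]] yields tau = 0.</p>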
pattern recognition; statistical-heuristic feature selection criterion; decision tree induction; multibranching decision trees; Bayesian classifier; built-in statistical test; proportional-reduction-in-error; cost-of-complexity heuristic; robustness; middle-cut preference; tau criterion; prepruning; dynamic error estimation; Bayes methods; decision theory; statistics; trees (mathematics)
X.J. Zhou, T.S. Dillon, "A Statistical-Heuristic Feature Selection Criterion for Decision Tree Induction", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 13, no. , pp. 834-841, August 1991, doi:10.1109/34.85676