Issue No. 04 - April 2006 (Vol. 28)
pp. 544-554
ABSTRACT
This paper is concerned with the selection of a generative model for supervised classification. Classical criteria for model selection assess the fit of a model rather than its ability to produce a low classification error rate. A new criterion, the Bayesian Entropy Criterion (BEC), is proposed. This criterion takes the decisional purpose of a model into account by minimizing the integrated classification entropy. It provides an interesting alternative to the cross-validated error rate, which is computationally expensive. The asymptotic behavior of the BEC criterion is presented. Numerical experiments on both simulated and real data sets show that BEC is better than the BIC criterion at selecting a model that minimizes the classification error rate, and that it performs comparably to the cross-validated error rate.
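For illustration only (this is not the authors' code, and the BEC formula itself is not reproduced here), the sketch below sets up the model-selection task the abstract describes: choosing between two Gaussian generative classifiers (shared versus per-class covariances) by comparing the BIC of the joint model with a k-fold cross-validated error rate, the computationally expensive baseline that BEC is intended to replace. All function names (fit, bic, cv_error_rate) are made up for this sketch.

import numpy as np
from scipy.stats import multivariate_normal

def fit(X, y, shared_cov=True):
    """ML estimates of class priors, means, and covariance(s)."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    covs = np.array([np.cov(X[y == c], rowvar=False, bias=True) for c in classes])
    if shared_cov:  # pooled (LDA-like) covariance instead of per-class (QDA-like)
        pooled = np.average(covs, axis=0, weights=priors)
        covs = np.array([pooled] * len(classes))
    return classes, priors, means, covs

def joint_log_lik(X, y, model):
    """log p(x, y | theta-hat): the joint fit that BIC rewards."""
    classes, priors, means, covs = model
    return sum(np.sum(np.log(priors[k]) +
                      multivariate_normal.logpdf(X[y == c], means[k], covs[k],
                                                 allow_singular=True))
               for k, c in enumerate(classes))

def bic(X, y, shared_cov=True):
    """BIC = -2 log L + nu * log n, with nu the number of free parameters."""
    model = fit(X, y, shared_cov)
    K, (n, d) = len(model[0]), X.shape
    nu = (K - 1) + K * d + (d * (d + 1) // 2) * (1 if shared_cov else K)
    return -2 * joint_log_lik(X, y, model) + nu * np.log(n)

def predict(X, model):
    """MAP classification under the fitted generative model."""
    classes, priors, means, covs = model
    scores = np.column_stack([np.log(priors[k]) +
                              multivariate_normal.logpdf(X, means[k], covs[k],
                                                         allow_singular=True)
                              for k in range(len(classes))])
    return classes[scores.argmax(axis=1)]

def cv_error_rate(X, y, shared_cov=True, folds=5, seed=0):
    """k-fold cross-validated error rate: the costly criterion BEC aims to replace."""
    idx = np.random.default_rng(seed).permutation(len(y))
    errors = 0
    for test in np.array_split(idx, folds):
        train = np.setdiff1d(idx, test)
        model = fit(X[train], y[train], shared_cov)
        errors += np.sum(predict(X[test], model) != y[test])
    return errors / len(y)

In this setup, the question the paper studies is whether a criterion such as BEC can identify the model with the lower classification error rate without paying the cost of the cross-validation loop, which BIC (fitted to the joint density) does not reliably do.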
INDEX TERMS
Generative classification, integrated likelihood, integrated conditional likelihood, classification entropy, cross-validated error rate, AIC and BIC criteria.
CITATION
Guillaume Bouchard, Gilles Celeux, "Selection of Generative Models in Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 4, pp. 544-554, April 2006, doi: 10.1109/TPAMI.2006.82