DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2006.82
This paper is concerned with the selection of a generative model for supervised classification. Classical model selection criteria assess the fit of a model rather than its ability to produce a low classification error rate. A new criterion, the Bayesian Entropy Criterion (BEC), is proposed. This criterion takes into account the decisional purpose of a model by minimizing the integrated classification entropy. It provides an interesting alternative to the cross-validated error rate, which is computationally expensive. The asymptotic behavior of the BEC criterion is presented. Numerical experiments on both simulated and real data sets show that BEC performs better than the BIC criterion at selecting a model that minimizes the classification error rate, and provides performance comparable to the cross-validated error rate.
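The abstract contrasts two standard baselines: BIC, which penalizes the maximized likelihood by model complexity, and the cross-validated error rate, which directly estimates predictive error. The sketch below illustrates both baselines on simulated data, choosing between two Gaussian generative classifiers (shared covariance vs. per-class covariance). The data setup, function names, and the two candidate models are illustrative assumptions, not the paper's experiments; the BEC criterion itself is not reproduced here since its formula is not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class Gaussian data (not from the paper's experiments)
n = 200
X = np.vstack([rng.normal([0, 0], 1.0, size=(n, 2)),
               rng.normal([3, 3], 1.0, size=(n, 2))])
y = np.r_[np.zeros(n, int), np.ones(n, int)]

def fit_gaussian_model(X, y, shared_cov):
    """Fit a two-class Gaussian generative model.
    shared_cov=True  -> common covariance (fewer parameters),
    shared_cov=False -> one covariance per class."""
    classes = np.unique(y)
    params, pooled = {}, np.zeros((X.shape[1], X.shape[1]))
    for c in classes:
        Xc = X[y == c]
        cov = np.cov(Xc, rowvar=False)
        pooled += cov * (len(Xc) - 1)
        params[c] = {"pi": len(Xc) / len(X), "mu": Xc.mean(axis=0), "cov": cov}
    if shared_cov:
        pooled /= len(X) - len(classes)
        for c in classes:
            params[c]["cov"] = pooled
    return params

def log_joint(params, X):
    """log p(x, y=c) for each class c, one column per class label."""
    d, cols = X.shape[1], []
    for c in sorted(params):
        p = params[c]
        diff = X - p["mu"]
        inv = np.linalg.inv(p["cov"])
        _, logdet = np.linalg.slogdet(p["cov"])
        ll = -0.5 * (np.sum(diff @ inv * diff, axis=1)
                     + logdet + d * np.log(2 * np.pi))
        cols.append(np.log(p["pi"]) + ll)
    return np.column_stack(cols)

def bic(params, X, y, shared_cov):
    """BIC = -2 log-likelihood + (# free parameters) * log N."""
    d, k = X.shape[1], len(params)
    n_cov = 1 if shared_cov else k
    n_params = (k - 1) + k * d + n_cov * d * (d + 1) / 2
    loglik = np.sum(log_joint(params, X)[np.arange(len(y)), y])
    return -2 * loglik + n_params * np.log(len(X))

def cv_error(X, y, shared_cov, folds=5):
    """k-fold cross-validated classification error rate."""
    idx, errs = rng.permutation(len(X)), 0
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        fitted = fit_gaussian_model(X[train], y[train], shared_cov)
        pred = np.argmax(log_joint(fitted, X[test]), axis=1)
        errs += np.sum(pred != y[test])
    return errs / len(X)

for shared in (True, False):
    fitted = fit_gaussian_model(X, y, shared)
    print(f"shared_cov={shared}: BIC={bic(fitted, X, y, shared):.1f}, "
          f"CV error={cv_error(X, y, shared):.3f}")
```

On this data (generated with equal covariances) both criteria favor the shared-covariance model; the paper's point is that fit-based criteria like BIC and error-based criteria like cross-validation can disagree on which model to select, which is the gap BEC addresses.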
Generative classification, integrated likelihood, integrated conditional likelihood, classification entropy, cross-validated error rate, AIC and BIC criteria.
Guillaume Bouchard, Gilles Celeux, "Selection of Generative Models in Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.28, no. 4, pp. 544-554, April 2006, doi:10.1109/TPAMI.2006.82