Issue No. 03 - March (1993 vol. 15)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.204915
<p>A new approach for estimating classification errors is presented. In the model, there are two types of classification error: empirical error and generalization error. The first is the error observed over the training samples; the second is the discrepancy between the true error probability and the empirical error. In this research, the Vapnik-Chervonenkis dimension (VCdim) is used as a measure of classifier complexity, and an estimate for the generalization error is developed based on this complexity measure. An optimal classifier design criterion, the generalized minimum empirical error criterion (GMEE), is proposed. The GMEE criterion consists of two terms: the empirical error and the estimate of the generalization error. As an application, the criterion is used to design an optimal neural network classifier. A corollary on the Gamma optimality of neural-network-based classifiers is proven. Thus, the approach provides a theoretic foundation for the connectionist approach to optimal classifier design. Experimental results are presented that validate the approach.</p>
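The two-term structure of a GMEE-style criterion can be sketched as follows. This is an illustrative approximation using the standard Vapnik-style confidence term for the generalization gap, not the paper's exact estimate; the function names, the sample size `n`, the VC dimension `h`, and the confidence level `eta` are assumptions for the sketch.

```python
import math

def vc_confidence(n, h, eta=0.05):
    """Vapnik-style upper bound on the gap between the true error
    probability and the empirical error, holding with probability
    at least 1 - eta, for n samples and VC dimension h."""
    return math.sqrt((h * (math.log(2 * n / h) + 1) - math.log(eta / 4)) / n)

def gmee_criterion(empirical_error, n, h, eta=0.05):
    """GMEE-style objective: empirical error plus an estimate of
    the generalization error (here, the VC confidence term)."""
    return empirical_error + vc_confidence(n, h, eta)

# Comparing two hypothetical classifiers on n = 1000 samples: a complex
# one (h = 50) with lower training error may still lose to a simpler
# one (h = 10) once the generalization-error estimate is added in.
simple = gmee_criterion(empirical_error=0.12, n=1000, h=10)
complex_ = gmee_criterion(empirical_error=0.05, n=1000, h=50)
```

Minimizing such a criterion trades training accuracy against classifier complexity, which is the design principle the abstract describes.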
classification error estimation; image recognition; Vapnik-Chervonenkis dimension; asymptotically optimal adaptive classifier; design criterion; generalization error; error probability; classifier complexity; generalized minimum empirical error criterion; neural network classifier; error analysis; estimation theory; neural nets; optimisation
W. Lee and M. Tenorio, "On an Asymptotically Optimal Adaptive Classifier Design Criterion," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 15, no. 3, pp. 312-318, March 1993.