Comments on Esposito et al.
May 1997 (vol. 19, no. 5), pp. 492-493

Abstract—This note discusses the inferential procedure used by Esposito et al. [1] to compare the performance of methods of classification, makes some links with recent research on resampling methodology, and mentions some alternative approaches.

[1] F. Esposito, D. Malerba, and G. Semeraro, “A Comparative Analysis of Methods for Pruning Decision Trees,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 5, pp. 476-491, 1997.
[2] R. Tibshirani, "Bias, Variance and Prediction Error for Classification Rules," Technical Report, Dept. of Statistics, Univ. of Toronto, 1996.
[3] B. Efron and R. Tibshirani, An Introduction to the Bootstrap. London: Chapman and Hall, 1993.
[4] R. Kohavi, "A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection," Proc. Int'l Joint Conf. Artificial Intelligence, 1995.
[5] C.R. Rao, Linear Statistical Inference and Its Applications. New York: Wiley, 1973.
[6] J. Shao and D. Tu, The Jackknife and Bootstrap. New York: Springer, 1995.
[7] B. Efron and R. Tibshirani, An Introduction to the Bootstrap. London: Chapman and Hall, 1993.
[8] C.F.J. Wu, "On the Asymptotic Properties of the Jackknife Histogram," Annals of Statistics, pp. 1,438-1,452, 1990.
[9] S. Geisser, "The Predictive Sample Reuse Method With Applications," J. Am. Stat. Assoc., vol. 70, pp. 320-328, 1975.
[10] J. Shao, "Linear Model Selection by Cross Validation," J. Am. Stat. Assoc., vol. 88, pp. 486-494, 1993.

Citation:
Jim Kay, "Comments on Esposito et al.," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 5, pp. 492-493, May 1997, doi:10.1109/TPAMI.1997.589208