
Bibliographic References  
Probal Chaudhuri, Anil K. Ghosh, Hannu Oja, "Classification Based on Hybridization of Parametric and Nonparametric Classifiers," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 7, pp. 1153-1164, July 2009.
[1] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees. Wadsworth and Brooks Press, 1984.
[2] L. Breiman, “Bagging Predictors,” Machine Learning, vol. 24, pp. 123-140, 1996.
[3] C. Bolance, M. Guillen, and J.P. Nielsen, “Kernel Density Estimation of Actuarial Loss Functions,” Insurance: Math. and Economics, vol. 32, pp. 19-36, 2003.
[4] T. Buch-Larsen, J.P. Nielsen, M. Guillen, and C. Bolance, “Kernel Density Estimation for Heavy-Tailed Distributions Using the Champernowne Transformation,” Statistics, vol. 39, pp. 503-518, 2005.
[5] P. Chaudhuri and J.S. Marron, “SiZer for Exploration of Structures in Curves,” J. Am. Statistical Assoc., vol. 94, pp. 807-823, 1999.
[6] P. Chaudhuri and J.S. Marron, “Scale Space View of Curve Estimation,” Annals of Statistics, vol. 28, pp. 408-428, 2000.
[7] T.M. Cover and P.E. Hart, “Nearest Neighbor Pattern Classification,” IEEE Trans. Information Theory, vol. 13, pp. 21-27, 1967.
[8] B.V. Dasarathy, Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE CS, 1991.
[9] R. Duda, P. Hart, and D.G. Stork, Pattern Classification. John Wiley & Sons, 2000.
[10] R.A. Fisher, “The Use of Multiple Measurements in Taxonomic Problems,” Annals of Eugenics, vol. 7, pp. 179-188, 1936.
[11] E. Fix and J.L. Hodges Jr., “Discriminatory Analysis, Nonparametric Discrimination, Consistency Properties,” Report No. 4, Project 21-49-004, 1951.
[12] J.H. Friedman, “Flexible Metric Nearest Neighbor Classification,” technical report, Dept. of Statistics, Stanford Univ., 1994.
[13] J.H. Friedman, T. Hastie, and R. Tibshirani, “Additive Logistic Regression: A Statistical View of Boosting (with Discussion),” Annals of Statistics, vol. 28, pp. 337-374, 2000.
[14] K. Fukunaga and L.D. Hostetler, “Optimization of $k$-Nearest Neighbor Density Estimates,” IEEE Trans. Information Theory, vol. 19, pp. 320-326, 1973.
[15] A.K. Ghosh, P. Chaudhuri, and C.A. Murthy, “On Visualization and Aggregation of Nearest Neighbor Classifiers,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 10, pp. 1592-1602, Oct. 2005.
[16] A.K. Ghosh, P. Chaudhuri, and D. Sengupta, “Classification Using Kernel Density Estimates: Multi-Scale Analysis and Visualization,” Technometrics, vol. 48, pp. 120-132, 2006.
[17] A.K. Ghosh and S. Bose, “Feature Extraction for Classification Using Statistical Networks,” Int'l J. Pattern Recognition and Artificial Intelligence, vol. 21, pp. 1103-1126, 2007.
[18] I. Glad, “Parametrically Guided Nonparametric Regression,” Scandinavian J. Statistics, vol. 25, pp. 649-668, 1998.
[19] F. Godtliebsen, J.S. Marron, and P. Chaudhuri, “Significance in Scale Space for Bivariate Density Estimation,” J. Computational and Graphical Statistics, vol. 11, pp. 1-22, 2002.
[20] D.J. Hand, Kernel Discriminant Analysis. John Wiley & Sons, 1982.
[21] T. Hastie, R. Tibshirani, and A. Buja, “Flexible Discriminant Analysis,” J. Am. Statistical Assoc., vol. 89, pp. 1255-1270, 1994.
[22] T. Hastie and R. Tibshirani, “Discriminant Adaptive Nearest Neighbor Classification,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 607-616, June 1996.
[23] T. Hastie, R. Tibshirani, and J.H. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
[24] N.L. Hjort and I. Glad, “Nonparametric Density Estimation with a Parametric Start,” Annals of Statistics, vol. 23, pp. 882-904, 1995.
[25] N.L. Hjort and M.C. Jones, “Locally Parametric Nonparametric Density Estimation,” Annals of Statistics, vol. 24, pp. 1619-1647, 1996.
[26] C.C. Holmes and N.M. Adams, “A Probabilistic Nearest Neighbor Method for Statistical Pattern Recognition,” J. Royal Statistical Soc., Series B, vol. 64, pp. 295-306, 2002.
[27] C.C. Holmes and N.M. Adams, “Likelihood Inference in Nearest-Neighbor Classification Methods,” Biometrika, vol. 90, pp. 99-112, 2003.
[28] F. Hoti and L. Holmström, “A Semiparametric Density Estimation Approach to Pattern Classification,” Pattern Recognition, vol. 37, pp. 409-419, 2004.
[29] R.A. Johnson and D.W. Wichern, Applied Multivariate Statistical Analysis. Prentice Hall, 1992.
[30] M.C. Jones, O. Linton, and J.P. Nielsen, “A Simple and Effective Bias Reduction Method for Density and Regression Estimation,” Biometrika, vol. 82, pp. 327-338, 1995.
[31] P.A. Lachenbruch and M.R. Mickey, “Estimation of Error Rates in Discriminant Analysis,” Technometrics, vol. 10, pp. 1-11, 1968.
[32] S.L. Lai, “Large Sample Properties of k-Nearest Neighbor Procedures,” PhD dissertation, Dept. of Math., Univ. of California, Los Angeles, 1977.
[33] D.O. Loftsgaarden and C.P. Quesenberry, “A Nonparametric Estimate of a Multivariate Density Function,” Annals of Math. Statistics, vol. 36, pp. 1049-1051, 1965.
[34] Y.P. Mack, “Local Properties of k-NN Regression Estimates,” SIAM J. Algebraic and Discrete Methods, vol. 2, pp. 311-323, 1981.
[35] P.C. Mahalanobis, “On the Generalized Distance in Statistics,” Proc. Nat'l Inst. of Science, vol. 12, pp. 49-55, 1936.
[36] G.J. McLachlan, Discriminant Analysis and Statistical Pattern Recognition. John Wiley & Sons, 1992.
[37] I. Olkin and C.H. Spiegelman, “A Semiparametric Approach to Density Estimation,” J. Am. Statistical Assoc., vol. 82, pp. 858-865, 1987.
[38] B.D. Ripley, Pattern Recognition and Neural Networks. Cambridge Univ. Press, 1996.
[39] R.E. Schapire, Y. Freund, P. Bartlett, and W. Lee, “Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods,” Annals of Statistics, vol. 26, pp. 1651-1686, 1998.
[40] D.W. Scott, Multivariate Density Estimation: Theory, Practice and Visualization. John Wiley & Sons, 1992.
[41] D.B. Skalak, “Prototype Selection for Composite Nearest Neighbor Classifiers,” PhD dissertation, Dept. of Computer Science, Univ. of Massachusetts, 1996.
[42] B.W. Silverman, “Weak and Strong Uniform Consistency of the Kernel Estimate of a Density Function and Its Derivatives,” Annals of Statistics, vol. 6, pp. 177-184, 1978.
[43] B.W. Silverman, Density Estimation for Statistics and Data Analysis. Chapman and Hall, 1986.
[44] V.N. Vapnik, Statistical Learning Theory. John Wiley & Sons, 1998.
[45] M. Wand and M.C. Jones, Kernel Smoothing. Chapman and Hall, 1995.