
This Article  
 
Bibliographic References  
Cor J. Veenman and Marcel J.T. Reinders, "The Nearest Subclass Classifier: A Compromise between the Nearest Mean and Nearest Neighbor Classifier," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1417-1429, Sept. 2005, doi: 10.1109/TPAMI.2005.187.
[1] D.W. Aha, D. Kibler, and M.K. Albert, “Instance-Based Learning Algorithms,” Machine Learning, vol. 6, pp. 37-66, 1991.
[2] C. Ambroise and G.J. McLachlan, “Selection Bias in Gene Extraction on the Basis of Microarray Gene-Expression Data,” Proc. Nat'l Academy of Sciences (PNAS), vol. 99, no. 10, pp. 6562-6566, May 2002.
[3] G.H. Ball and D.J. Hall, “A Clustering Technique for Summarizing Multivariate Data,” Behavioral Science, vol. 12, pp. 153-155, Mar. 1967.
[4] J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms. New York: Plenum Press, 1981.
[5] J.C. Bezdek and L.I. Kuncheva, “Nearest Prototype Classifier Designs: An Experimental Study,” Int'l J. Intelligent Systems, vol. 16, pp. 1445-1473, 2001.
[6] J.C. Bezdek and N.R. Pal, “Some New Indexes of Cluster Validity,” IEEE Trans. Systems, Man, and Cybernetics-Part B, vol. 28, no. 3, pp. 301-315, 1998.
[7] J.C. Bezdek, T.R. Reichherzer, G.S. Lim, and Y. Attikiouzel, “Multiple-Prototype Classifier Design,” IEEE Trans. Systems, Man, and Cybernetics-Part C: Applications and Reviews, vol. 28, no. 1, pp. 67-79, 1998.
[8] C.L. Blake and C.J. Merz, UCI Repository of Machine Learning Databases, Univ. of California, Irvine, 1998.
[9] H. Brighton and C. Mellish, “Advances in Instance Selection for Instance-Based Learning Algorithms,” Data Mining and Knowledge Discovery, vol. 6, pp. 153-172, 2002.
[10] V. Cerverón and F.J. Ferri, “Another Move Toward the Minimum Consistent Subset: A Tabu Search Approach to the Condensed Nearest Neighbor Rule,” IEEE Trans. Systems, Man, and Cybernetics-Part B: Cybernetics, vol. 31, no. 3, pp. 408-413, 2001.
[11] C.L. Chang, “Finding Prototypes for Nearest Neighbor Classifiers,” IEEE Trans. Computers, vol. 23, no. 11, pp. 1179-1184, Nov. 1974.
[12] D. Chaudhuri, C.A. Murthy, and B.B. Chaudhuri, “Finding a Subset of Representative Points in a Data Set,” IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 9, pp. 1416-1424, 1994.
[13] T.M. Cover and P.E. Hart, “Nearest Neighbor Pattern Classification,” IEEE Trans. Information Theory, vol. 13, no. 1, pp. 21-27, 1967.
[14] Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques, B.V. Dasarathy, ed. Los Alamitos, Calif.: IEEE Computer Society Press, 1991.
[15] B.V. Dasarathy, “Minimal Consistent Set (MCS) Identification for Optimal Nearest Neighbor Decision Systems Design,” IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 3, pp. 511-517, 1994.
[16] B.V. Dasarathy, J.S. Sánchez, and S. Townsend, “Nearest Neighbor Editing and Condensing Tools: Synergy Exploitation,” Pattern Analysis and Applications, vol. 3, no. 1, pp. 19-30, 2000.
[17] D.L. Davies and D.W. Bouldin, “A Cluster Separation Measure,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 224-227, Apr. 1979.
[18] J.C. Dunn, “Well Separated Clusters and Optimal Fuzzy Partitions,” J. Cybernetics, vol. 4, pp. 95-104, 1974.
[19] B. Efron and R. Tibshirani, “Improvements on Cross-Validation: The .632+ Bootstrap Method,” J. Am. Statistical Assoc., vol. 92, no. 438, pp. 548-560, 1997.
[20] R.A. Fisher, “The Use of Multiple Measurements in Taxonomic Problems,” Annals of Eugenics, vol. 7, no. 2, pp. 179-188, 1936.
[21] E. Fix and J.L. Hodges, “Discriminatory Analysis: Nonparametric Discrimination: Consistency Properties,” USAF School of Aviation Medicine, Project 21-49-004 (Report Number 4), pp. 261-279, 1951.
[22] V. Ganti, J. Gehrke, and R. Ramakrishnan, “Mining Very Large Databases,” Computer, vol. 32, no. 8, pp. 38-45, Aug. 1999.
[23] G.W. Gates, “The Reduced Nearest Neighbor Rule,” IEEE Trans. Information Theory, vol. 18, no. 3, pp. 431-433, 1972.
[24] S. Geman, E. Bienenstock, and R. Doursat, “Neural Networks and the Bias/Variance Dilemma,” Neural Computation, vol. 4, pp. 1-58, 1992.
[25] F. Glover and M. Laguna, Tabu Search. Boston: Kluwer Academic, 1997.
[26] Y. Hamamoto, S. Uchimura, and S. Tomita, “A Bootstrap Technique for Nearest Neighbor Classifier Design,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 1, pp. 73-79, Jan. 1997.
[27] P.E. Hart, “The Condensed Nearest Neighbor Rule,” IEEE Trans. Information Theory, vol. 14, no. 3, pp. 515-516, 1968.
[28] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, 2001.
[29] R.J. Henery, Machine Learning, Neural and Statistical Classification, chapter 7, pp. 107-124. Ellis Horwood, 1994.
[30] A.E. Hoerl and R.W. Kennard, “Ridge Regression: Biased Estimation for Nonorthogonal Problems,” Technometrics, vol. 12, no. 1, pp. 55-67, 1970.
[31] L.J. Hubert and P. Arabie, “Comparing Partitions,” J. Classification, vol. 2, pp. 193-218, 1985.
[32] A.K. Jain and R.C. Dubes, Algorithms for Clustering Data. New Jersey: Prentice-Hall Inc., 1988.
[33] A.K. Jain and D.E. Zongker, “Feature Selection: Evaluation, Application, and Small Sample Performance,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 2, pp. 153-158, Feb. 1997.
[34] R. Kohavi and G.H. John, “Wrappers for Feature Subset Selection,” Artificial Intelligence, vol. 97, pp. 273-324, Dec. 1997.
[35] T. Kohonen, “Improved Versions of Learning Vector Quantization,” Proc. Int'l Joint Conf. Neural Networks, vol. 1, pp. 545-550, 1990.
[36] L.I. Kuncheva and J.C. Bezdek, “Presupervised and Postsupervised Prototype Classifier Design,” IEEE Trans. Neural Networks, vol. 10, no. 5, pp. 1142-1152, Sept. 1999.
[37] L.I. Kuncheva and J.C. Bezdek, “Nearest Prototype Classification: Clustering, Genetic Algorithms, or Random Search,” IEEE Trans. Systems, Man, and Cybernetics, vol. 28, no. 1, pp. 160-164, 1998.
[38] W. Lam, C.K. Keung, and C.X. Ling, “Learning Good Prototypes for Classification Using Filtering and Abstraction of Instances,” Pattern Recognition, vol. 35, no. 7, pp. 1491-1506, July 2002.
[39] W. Lam, C.K. Keung, and D. Liu, “Discovering Useful Concept Prototypes for Classification Based on Filtering and Abstraction,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 8, pp. 1075-1090, Aug. 2002.
[40] P. Mitra, C.A. Murthy, and S.K. Pal, “Density-Based Multiscale Data Condensation,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 6, pp. 734-747, June 2002.
[41] R.A. Mollineda, F.J. Ferri, and E. Vidal, “An Efficient Prototype Merging Strategy for the Condensed 1-NN Rule Through Class-Conditional Hierarchical Clustering,” Pattern Recognition, vol. 35, pp. 2771-2782, 2002.
[42] G.L. Ritter, H.B. Woodruff, S.R. Lowry, and T.L. Isenhour, “An Algorithm for a Selective Nearest Neighbor Decision Rule,” IEEE Trans. Information Theory, vol. 21, no. 6, pp. 665-669, 1975.
[43] C. Schaffer, “Overfitting Avoidance as Bias,” Machine Learning, vol. 10, pp. 153-178, 1993.
[44] V.G. Sigillito, S.P. Wing, L.V. Hutton, and K.B. Baker, “Classification of Radar Returns from the Ionosphere Using Neural Networks,” Johns Hopkins APL Technical Digest, vol. 10, pp. 262-266, 1989.
[45] D.B. Skalak, “Prototype and Feature Selection by Sampling and Random Mutation Hill Climbing Algorithms,” Proc. 11th Int'l Conf. Machine Learning, pp. 293-301, 1994.
[46] C.W. Swonger, “Sample Set Condensation for a Condensed Nearest Neighbor Decision Rule for Pattern Recognition,” Frontiers of Pattern Recognition, pp. 511-519, 1972.
[47] I. Tomek, “An Experiment with the Edited Nearest-Neighbor Rule,” IEEE Trans. Systems, Man, and Cybernetics, vol. 6, no. 6, pp. 448-452, 1976.
[48] C.J. Veenman, M.J.T. Reinders, and E. Backer, “A Maximum Variance Cluster Algorithm,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 9, pp. 1273-1280, Sept. 2002.
[49] C.J. Veenman, M.J.T. Reinders, and E. Backer, “A Cellular Coevolutionary Algorithm for Image Segmentation,” IEEE Trans. Image Processing, vol. 12, no. 3, pp. 304-316, Mar. 2003.
[50] G. Wilfong, “Nearest Neighbor Problems,” Proc. Seventh Ann. Symp. Computational Geometry, pp. 224-233, 1991.
[51] D.L. Wilson, “Asymptotic Properties of Nearest Neighbor Rules Using Edited Data,” IEEE Trans. Systems, Man, and Cybernetics, vol. 2, no. 3, pp. 408-421, July 1972.
[52] D.R. Wilson and T.R. Martinez, “Instance Pruning Techniques,” Proc. 14th Int'l Conf. Machine Learning, pp. 404-411, 1997.
[53] D.R. Wilson and T.R. Martinez, “Reduction Techniques for Instance-Based Learning Algorithms,” Machine Learning, vol. 38, pp. 257-286, 2000.
[54] W.H. Wolberg and O.L. Mangasarian, “Multisurface Method of Pattern Separation for Medical Diagnoses Applied to Breast Cytology,” Proc. Nat'l Academy of Sciences, vol. 87, pp. 9193-9196, Dec. 1990.