Issue No. 3 - March 2012 (vol. 34), pp. 417-435
Salvador García, University of Jaén, Jaén
Joaquín Derrac, CITIC-UGR (Research Center on Information and Communications Technology), Granada
José Ramón Cano, University of Jaén, Jaén
Francisco Herrera, CITIC-UGR (Research Center on Information and Communications Technology), Granada
ABSTRACT
The nearest neighbor classifier is one of the most widely used and well-known techniques for performing recognition tasks. Despite its simplicity, it has also proven to be one of the most useful algorithms in data mining. However, the nearest neighbor classifier suffers from several drawbacks, such as high storage requirements, low classification efficiency, and low noise tolerance. These weaknesses have been studied by many researchers, and many solutions have been proposed. Among them, one of the most promising consists of reducing the data used to establish the classification rule (the training data) by selecting relevant prototypes. Many prototype selection methods exist in the literature, and research in this area is still advancing. Different properties can be observed in their definitions, but no formal categorization has been established yet. This paper provides a survey of the prototype selection methods proposed in the literature, from both a theoretical and an empirical point of view. From the theoretical point of view, we propose a taxonomy based on the main characteristics of prototype selection methods and analyze their advantages and drawbacks. Empirically, we conduct an experimental study involving data sets of different sizes to measure performance in terms of accuracy, reduction capability, and runtime. The results obtained by all the methods studied have been verified by nonparametric statistical tests. Several remarks, guidelines, and recommendations are made concerning the use of prototype selection for nearest neighbor classification.
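To make the two classical families named in the index terms concrete, the sketch below implements one representative of each: Hart's Condensed Nearest Neighbor rule (condensation), which retains the instances needed to keep the reduced subset consistent with the training data, and Wilson's Edited Nearest Neighbor rule (edition), which discards instances misclassified by their k nearest neighbors. The sketch is not taken from the paper; Python with NumPy, Euclidean distance, and integer class labels 0..C-1 are our assumptions, and the function names cnn_select and enn_select are hypothetical.

    import numpy as np

    def cnn_select(X, y, seed=0):
        """Condensation sketch: Hart's Condensed Nearest Neighbor (CNN).
        Greedily absorbs any instance that the current prototype subset
        misclassifies with 1-NN, until a full pass adds nothing, so the
        retained prototypes are consistent with the training set."""
        rng = np.random.default_rng(seed)
        # Seed the subset with one random instance per class.
        keep = [rng.choice(np.where(y == c)[0]) for c in np.unique(y)]
        changed = True
        while changed:
            changed = False
            for i in rng.permutation(len(X)):
                if i in keep:
                    continue
                d = np.linalg.norm(X[keep] - X[i], axis=1)
                if y[keep[int(np.argmin(d))]] != y[i]:  # misclassified: absorb
                    keep.append(i)
                    changed = True
        return np.sort(keep)

    def enn_select(X, y, k=3):
        """Edition sketch: Wilson's Edited Nearest Neighbor (ENN).
        Removes instances whose class disagrees with the majority vote
        of their k nearest neighbors, smoothing noisy boundaries."""
        keep = []
        for i in range(len(X)):
            d = np.linalg.norm(X - X[i], axis=1)
            d[i] = np.inf                      # exclude the point itself
            nn = np.argsort(d)[:k]
            votes = np.bincount(y[nn], minlength=y.max() + 1)
            if votes.argmax() == y[i]:         # agrees with its neighborhood
                keep.append(i)
        return np.array(keep)

Running a 1-NN classifier on only X[keep], y[keep] illustrates the trade-off the empirical study quantifies: condensation methods aim primarily at storage reduction, while edition methods mainly improve noise tolerance at a more modest reduction rate.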
INDEX TERMS
Prototype selection, nearest neighbor, taxonomy, condensation, edition, classification.
CITATION
Salvador García, Joaquín Derrac, José Ramón Cano, and Francisco Herrera, "Prototype Selection for Nearest Neighbor Classification: Taxonomy and Empirical Study," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 3, pp. 417-435, March 2012, doi:10.1109/TPAMI.2011.142