
Bibliographic References  
Fabrizio Angiulli, "Fast Nearest Neighbor Condensation for Large Data Sets Classification," IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 11, pp. 1450-1464, Nov. 2007.
[1] D.W. Aha, “Editorial,” Artificial Intelligence Rev., special issue on lazy learning, vol. 11, nos. 1-5, pp. 7-10, 1997.
[2] D.W. Aha, D. Kibler, and M.K. Albert, “Instance-Based Learning Algorithms,” Machine Learning, vol. 6, pp. 37-66, 1991.
[3] E. Alpaydin, “Voting over Multiple Condensed Nearest Neighbors,” Artificial Intelligence Rev., vol. 11, pp. 115-132, 1997.
[4] F. Angiulli, “Fast Condensed Nearest Neighbor Rule,” Proc. 22nd Int'l Conf. Machine Learning (ICML '05), pp. 25-32, 2005.
[5] S. Bay, “Combining Nearest Neighbor Classifiers through Multiple Feature Subsets,” Proc. 15th Int'l Conf. Machine Learning (ICML '98), 1998.
[6] S. Bay, “Nearest Neighbor Classification from Multiple Feature Sets,” Intelligent Data Analysis, vol. 3, pp. 191-209, 1999.
[7] B. Bhattacharya and D. Kaller, “Reference Set Thinning for the $k$ Nearest Neighbor Decision Rule,” Proc. 14th Int'l Conf. Pattern Recognition (ICPR '98), 1998.
[8] H. Brighton and C. Mellish, “Advances in Instance Selection for Instance-Based Learning Algorithms,” Data Mining and Knowledge Discovery, vol. 6, no. 2, pp. 153-172, 2002.
[9] R.M. Cameron-Jones, “Instance Selection by Encoding Length Heuristic with Random Mutation Hill Climbing,” Proc. Eighth Australian Joint Conf. Artificial Intelligence, pp. 99-106, 1995.
[10] T.M. Cover and P.E. Hart, “Nearest Neighbor Pattern Classification,” IEEE Trans. Information Theory, vol. 13, no. 1, pp. 21-27, 1967.
[11] B. Dasarathy, Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE CS Press, 1991.
[12] B. Dasarathy, “Minimal Consistent Subset (MCS) Identification for Optimal Nearest Neighbor Decision Systems Design,” IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 3, pp. 511-517, 1994.
[13] B. Dasarathy, “Nearest Unlike Neighbor (NUN): An Aid to Decision Confidence Estimation,” Optical Eng., vol. 34, pp. 2785-2792, 1995.
[14] F.S. Devi and M.N. Murty, “An Incremental Prototype Set Building Technique,” Pattern Recognition, vol. 35, no. 2, pp. 505-513, 2002.
[15] P. Devijver and J. Kittler, “On the Edited Nearest Neighbor Rule,” Proc. Fifth Int'l Conf. Pattern Recognition (ICPR '80), pp. 72-80, 1980.
[16] L. Devroye, “On the Inequality of Cover and Hart in Nearest Neighbor Discrimination,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 3, pp. 75-78, 1981.
[17] L. Devroye, L. Györfi, and G. Lugosi, A Probabilistic Theory of Pattern Recognition. Springer, 1996.
[18] K. Fukunaga and L.D. Hostetler, “$k$-Nearest-Neighbor Bayes-Risk Estimation,” IEEE Trans. Information Theory, vol. 21, pp. 285-293, 1975.
[19] V. Gaede and O. Günther, “Multidimensional Access Methods,” ACM Computing Surveys, vol. 30, no. 2, pp. 170-231, 1998.
[20] W. Gates, “The Reduced Nearest Neighbor Rule,” IEEE Trans. Information Theory, vol. 18, no. 3, pp. 431-433, 1972.
[21] P.E. Hart, “The Condensed Nearest Neighbor Rule,” IEEE Trans. Information Theory, vol. 14, no. 3, pp. 515-516, 1968.
[22] B. Karaçali and H. Krim, “Fast Minimization of Structural Risk by Nearest Neighbor Rule,” IEEE Trans. Neural Networks, vol. 14, no. 1, pp. 127-134, 2003.
[23] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-Based Learning Applied to Document Recognition,” Proc. IEEE, vol. 86, no. 11, pp. 2278-2324, 1998.
[24] C.L. Liu and M. Nakagawa, “Evaluation of Prototype Learning Algorithms for Nearest-Neighbor Classifier in Application to Handwritten Character Recognition,” Pattern Recognition, vol. 34, no. 3, pp. 601-615, 2001.
[25] G.L. Ritter, H.B. Woodruff, S.R. Lowry, and T.L. Isenhour, “An Algorithm for a Selective Nearest Neighbor Decision Rule,” IEEE Trans. Information Theory, vol. 21, pp. 665-669, 1975.
[26] C. Stanfill and D. Waltz, “Toward Memory-Based Reasoning,” Comm. ACM, vol. 29, pp. 1213-1228, 1986.
[27] C. Stone, “Consistent Nonparametric Regression,” Annals of Statistics, vol. 8, pp. 1348-1360, 1977.
[28] G. Toussaint, “Proximity Graphs for Nearest Neighbor Decision Rules: Recent Progress,” Proc. 34th Symp. Interface of Computing Science and Statistics (Interface '02), Apr. 2002.
[29] I. Watson and F. Marir, “Case-Based Reasoning: A Review,” The Knowledge Eng. Rev., vol. 9, no. 4, 1994.
[30] G. Wilfong, “Nearest Neighbor Problems,” Int'l J. Computational Geometry & Applications, vol. 2, no. 4, pp. 383-416, 1992.
[31] D.L. Wilson, “Asymptotic Properties of Nearest Neighbor Rules Using Edited Data,” IEEE Trans. Systems, Man, and Cybernetics, vol. 2, pp. 408-420, 1972.
[32] D.R. Wilson and T.R. Martinez, “Reduction Techniques for Instance-Based Learning Algorithms,” Machine Learning, vol. 38, no. 3, pp. 257-286, 2000.