
Bibliographic References  
Yuhua Li and Liam Maguire, "Selecting Critical Patterns Based on Local Geometrical and Statistical Information," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 6, pp. 1189-1201, June 2011.
[1] http://www.support-vector.net/software.html, 2009.
[2] ftp://ftp.dice.ucl.ac.be/pub/neural-nets/ELENA/databases/REAL/phoneme/, 2009.
[3] S. Ajoka, S. Tsuge, M. Shishibori, and K. Kita, "Fast Multidimensional Nearest Neighbor Search Algorithm Using Priority Queue," Electrical Eng. in Japan, vol. 164, no. 3, pp. 69-77, 2008.
[4] F. Angiulli, "Fast Nearest Neighbor Condensation for Large Data Sets Classification," IEEE Trans. Knowledge and Data Eng., vol. 19, no. 11, pp. 1450-1464, Nov. 2007.
[5] F. Angiulli, "Condensed Nearest Neighbor Data Domain Description," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 10, pp. 1746-1758, Oct. 2007.
[6] R. Barandela, F.J. Ferri, and J.S. Sanchez, "Decision Boundary Preserving Prototype Selection for Nearest Neighbour Classification," Int'l J. Pattern Recognition and Artificial Intelligence, vol. 19, no. 6, pp. 787-806, Sept. 2005.
[7] J.L. Bentley and J.H. Friedman, "Data Structures for Range Searching," ACM Computing Surveys, vol. 11, no. 4, pp. 397-409, 1979.
[8] J.C. Bezdek and L.I. Kuncheva, "Nearest Prototype Classifier Designs: An Experimental Study," Int'l J. Intelligent Systems, vol. 16, no. 12, pp. 1445-1473, Dec. 2001.
[9] D.S. Broomhead and D. Lowe, "Multivariable Function Interpolation and Adaptive Networks," Complex Systems, vol. 2, pp. 321-335, 1988.
[10] C.J.C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, vol. 2, pp. 121-167, 1998.
[11] C. Cachin, "Pedagogical Pattern Selection Strategies," Neural Networks, vol. 7, no. 1, pp. 175-181, 1994.
[12] G.C. Cawley and N.L.C. Talbot, "Efficient Leave-One-Out Cross-Validation of Kernel Fisher Discriminant Classifiers," Pattern Recognition, vol. 36, pp. 2585-2592, 2003.
[13] V. Cerverón and F.J. Ferri, "Another Move Toward the Minimum Consistent Subset: A Tabu Search Approach to the Condensed Nearest Neighbor Rule," IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 31, no. 3, pp. 408-413, June 2001.
[14] S.H. Cha and S.N. Srihari, "A Fast Nearest Neighbor Search Algorithm by Filtration," Pattern Recognition, vol. 35, no. 2, pp. 515-525, Feb. 2002.
[15] C.C. Chang and C.J. Lin, LIBSVM: A Library for Support Vector Machines, http://www.csie.ntu.edu.tw/~cjlin/libsvm, 2001.
[16] D. Chaudhuri, C.A. Murthy, and B.B. Chaudhuri, "Finding a Subset of Representative Points in a Data Set," IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 9, pp. 1416-1424, Sept. 1994.
[17] Y.X. Chen, X. Dang, H.X. Peng, and H.L. Bart, "Outlier Detection with the Kernelized Spatial Depth Function," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 2, pp. 288-305, Feb. 2009.
[18] S.H. Choi and P. Rockett, "The Training of Neural Classifiers with Condensed Data Sets," IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 32, no. 2, pp. 202-206, Apr. 2002.
[19] B.V. Dasarathy, "Minimal Consistent Set (MCS) Identification for Optimal Nearest Neighbor Decision Systems Design," IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 3, pp. 511-517, Mar. 1994.
[20] J. Demsar, "Statistical Comparisons of Classifiers over Multiple Data Sets," J. Machine Learning Research, vol. 7, pp. 1-30, Jan. 2006.
[21] G.M. Foody, "The Significance of Border Training Patterns in Classification by a Feedforward Neural Network Using Back Propagation Learning," Int'l J. Remote Sensing, vol. 20, no. 18, pp. 3549-3562, Dec. 1999.
[22] J.H. Friedman, J.L. Bentley, and R.A. Finkel, "An Algorithm for Finding Best Matches in Logarithmic Expected Time," ACM Trans. Math. Software, vol. 3, no. 3, pp. 209-226, Sept. 1977.
[23] K. Fukunaga, Introduction to Statistical Pattern Recognition, second ed. Morgan Kaufmann, 1990.
[24] K. Fukunaga and P.M. Narendra, "A Branch and Bound Algorithm for Computing K-Nearest Neighbors," IEEE Trans. Computers, vol. 24, no. 7, pp. 750-753, July 1975.
[25] K. Fukunaga and L.D. Hostetler, "Optimization of k-Nearest-Neighbor Density Estimates," IEEE Trans. Information Theory, vol. 19, no. 3, pp. 320-326, May 1973.
[26] A.K. Ghosh, "On Nearest Neighbor Classification Using Adaptive Choice of k," J. Computational and Graphical Statistics, vol. 16, no. 2, pp. 482-502, June 2007.
[27] A.K. Ghosh, "On Optimum Choice of k in Nearest Neighbour Classification," Computational Statistics & Data Analysis, vol. 50, pp. 3113-3123, 2006.
[28] G. Guo and J.S. Zhang, "Reducing Examples to Accelerate Support Vector Regression," Pattern Recognition Letters, vol. 28, no. 16, pp. 2173-2183, 2007.
[29] P. Hart, "The Condensed Nearest Neighbor Rule," IEEE Trans. Information Theory, vol. 14, no. 3, pp. 515-516, May 1968.
[30] T.K. Ho and M. Basu, "Complexity Measures of Supervised Classification Problems," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 289-300, Mar. 2002.
[31] S.W. Kim and B.J. Oommen, "A Brief Taxonomy and Ranking of Creative Prototype Reduction Schemes," Pattern Analysis and Applications J., vol. 6, no. 3, pp. 232-244, 2004.
[32] S.W. Kim and B.J. Oommen, "Enhancing Prototype Reduction Schemes with LVQ3-Type Algorithms," Pattern Recognition, vol. 36, no. 5, pp. 1083-1093, 2003.
[33] M.A. Kramer and J.A. Leonard, "Diagnosis Using Backpropagation Neural Networks—Analysis and Criticism," Computers & Chemical Eng., vol. 14, no. 12, pp. 1323-1338, 1990.
[34] J. Li, M.T. Manry, C. Yu, and D.R. Wilson, "Prototype Classifier Design with Pruning," Int'l J. Artificial Intelligence Tools, vol. 14, nos. 1/2, pp. 261-280, 2005.
[35] Y.H. Li, M.J. Pont, and N.B. Jones, "Improving the Performance of Radial Basis Function Classifiers in Condition Monitoring and Fault Diagnosis Applications Where 'Unknown' Faults May Occur," Pattern Recognition Letters, vol. 23, no. 5, pp. 569-577, Mar. 2002.
[36] Y.H. Li, M.J. Pont, N.B. Jones, and J.A. Twiddle, "Applying MLP and RBF Classifiers in Embedded Condition Monitoring and Fault Diagnosis Systems," Trans. Inst. of Measurement and Control, vol. 23, no. 5, pp. 315-343, 2001.
[37] T. Lin and H.B. Zha, "Riemannian Manifold Learning," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 30, no. 5, pp. 796-809, May 2008.
[38] M.T. Lozano, J.M. Sotoca, J.S. Sánchez, F. Pla, E. Pekalska, and R.P.W. Duin, "Experimental Study on Prototype Optimisation Algorithms for Prototype-Based Classification in Vector Spaces," Pattern Recognition, vol. 39, no. 10, pp. 1827-1838, 2006.
[39] A. Lyhyaoui, M. Martinez, I. Mora, M. Vazquez, J.L. Sancho, and A.R. Figueiras-Vidal, "Sample Selection via Clustering to Construct Support Vector-Like Classifiers," IEEE Trans. Neural Networks, vol. 10, no. 6, pp. 1474-1481, Nov. 1999.
[40] E. Marchiori, "Hit Miss Networks with Applications to Instance Selection," J. Machine Learning Research, vol. 9, pp. 997-1017, 2008.
[41] E. Marchiori, "Class Conditional Nearest Neighbor for Large Margin Instance Selection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 32, no. 2, pp. 364-370, Feb. 2010.
[42] M. Markou and S. Singh, "A Neural Network-Based Novelty Detector for Image Sequence Analysis," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1664-1677, Oct. 2006.
[43] K.G. Mehrotra, C.K. Mohan, and S. Ranka, "Bounds on the Number of Samples Needed for Neural Learning," IEEE Trans. Neural Networks, vol. 2, no. 6, pp. 548-558, Nov. 1991.
[44] N. Mekuz and J.K. Tsotsos, "Parameterless Isomap with Adaptive Neighborhood Selection," Proc. DAGM Symp., K. Franke, et al., eds., pp. 364-373, 2006.
[45] D.J. Newman, S. Hettich, C.L. Blake, and C.J. Merz, UCI Repository of Machine Learning Databases, Dept. of Information and Computer Science, http://www.ics.uci.edu/~mlearn/MLRepository.html, Univ. of California, 1998.
[46] R. Paredes and E. Vidal, "Learning Prototypes and Distances: A Prototype Reduction Technique Based on Nearest Neighbor Error Minimization," Pattern Recognition, vol. 39, no. 2, pp. 180-188, 2006.
[47] E. Pekalska, R.P.W. Duin, and P. Paclik, "Prototype Selection for Dissimilarity-Based Classifiers," Pattern Recognition, vol. 39, no. 2, pp. 189-208, Feb. 2006.
[48] R.T. Peres and C.E. Pedreira, "Generalized Risk Zone: Selecting Observations for Classification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 7, pp. 1331-1337, July 2009.
[49] D. Prokhorov, "IJCNN 2001 Neural Network Competition," Slide presentation at Int'l Joint Conf. Neural Networks, http://www.geocities.com/ijcnn/nnc_ijcnn01.pdf, 2001.
[50] T. Raicharoen and C. Lursinsap, "A Divide-and-Conquer Approach to the Pairwise Opposite Class-Nearest Neighbor (POC-NN) Algorithm," Pattern Recognition Letters, vol. 26, no. 10, pp. 1554-1567, July 2005.
[51] G. Ratsch, T. Onoda, and K.R. Muller, "Soft Margins for AdaBoost," Machine Learning, vol. 42, no. 2, pp. 287-320, 2001.
[52] D.E. Rumelhart, G.E. Hinton, and R.J. Williams, "Learning Representations by Back-Propagating Errors," Nature, vol. 323, pp. 533-536, Oct. 1986.
[53] H. Samet, "K-Nearest Neighbor Finding Using MaxNearestDist," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 243-252, Feb. 2008.
[54] O. Samko, A.D. Marshall, and P.L. Rosin, "Selection of the Optimal Parameter Value for the Isomap Algorithm," Pattern Recognition Letters, vol. 27, no. 9, pp. 968-979, June 2006.
[55] J.S. Sánchez and A.I. Marqués, "An LVQ-Based Adaptive Algorithm for Learning from Very Small Codebooks," Neurocomputing, vol. 69, nos. 7-9, pp. 922-927, 2006.
[56] J.S. Sánchez, "High Training Set Size Reduction by Space Partitioning and Prototype Abstraction," Pattern Recognition, vol. 37, no. 7, pp. 1561-1564, 2004.
[57] J.S. Sánchez, F. Pla, and F.J. Ferri, "Prototype Selection for the Nearest Neighbour Rule through Proximity Graphs," Pattern Recognition Letters, vol. 18, no. 6, pp. 507-513, 1997.
[58] J.L. Sancho, W.E. Pierson, B. Ulug, A.R. Figueiras-Vidal, and S.C. Ahalt, "Class Separability Estimation and Incremental Learning Using Boundary Methods," Neurocomputing, vol. 35, pp. 3-26, Nov. 2000.
[59] L.K. Saul and S.T. Roweis, "Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds," J. Machine Learning Research, vol. 4, no. 2, pp. 119-155, Feb. 2004.
[60] B. Schölkopf, R.C. Williamson, A.J. Smola, J.S. Taylor, and J.C. Platt, "Support Vector Method for Novelty Detection," Advances in Neural Information Processing Systems, vol. 12, pp. 582-588, MIT Press, 2000.
[61] H. Shin and S. Cho, "Fast Pattern Selection for Support Vector Classifiers," Proc. Conf. Advances in Knowledge Discovery and Data Mining, vol. 2637, pp. 376-387, 2003.
[62] H. Shin and S. Cho, "Neighborhood Property-Based Pattern Selection for Support Vector Machines," Neural Computation, vol. 19, no. 3, pp. 816-855, Mar. 2007.
[63] T. Tambouratzis, "Counter-Clustering for Training Pattern Selection," Computer J., vol. 43, no. 3, pp. 177-190, 2000.
[64] D.M.J. Tax and R.P.W. Duin, "Support Vector Data Description," Machine Learning, vol. 54, pp. 45-66, 2004.
[65] J.B. Tenenbaum, V. de Silva, and J.C. Langford, "A Global Geometric Framework for Nonlinear Dimensionality Reduction," Science, vol. 290, no. 5500, pp. 2319-2323, Dec. 2000.
[66] V.N. Vapnik, The Nature of Statistical Learning Theory. Springer-Verlag, 1995.
[67] K.R. Varshney and A.S. Willsky, "Classification Using Geometric Level Sets," J. Machine Learning Research, vol. 11, pp. 491-516, Feb. 2010.
[68] C.J. Veenman and M.J.T. Reinders, "The Nearest Subclass Classifier: A Compromise between the Nearest Mean and Nearest Neighbor Classifier," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1417-1429, Sept. 2005.
[69] G.H. Wen, L.J. Jiang, and J. Wen, "Using Locally Estimated Geodesic Distance to Optimize Neighborhood Graph for Isometric Data Embedding," Pattern Recognition, vol. 41, no. 7, pp. 2226-2236, July 2008.
[70] D.R. Wilson and T.R. Martinez, "Reduction Techniques for Instance-Based Learning Algorithms," Machine Learning, vol. 38, no. 3, pp. 257-286, Mar. 2000.
[71] H. Zhang and G. Sun, "Optimal Reference Subset Selection for Nearest Neighbor Classification by Tabu Search," Pattern Recognition, vol. 35, no. 7, pp. 1481-1490, 2002.