Issue No. 06 - June 2011 (vol. 33), pp. 1189-1201
Yuhua Li , University of Ulster, Londonderry
Pattern selection methods have traditionally been developed with a dependency on a specific classifier. In contrast, this paper presents a method that selects critical patterns, those deemed to carry the essential information needed to train any classifier that requires spatial information about the training data set. Critical patterns include edge patterns, which define the extent of a class, and border patterns, which separate classes. The proposed method selects patterns from a new perspective, primarily based on their location in input space: it determines class edge patterns with the aid of the approximated tangent hyperplane of a class surface, and it identifies border patterns between classes using local probability. The method is evaluated on benchmark problems with popular classifiers, including multilayer perceptrons, radial basis functions, support vector machines, and nearest neighbors. It is also compared with four state-of-the-art approaches and shown to achieve similar but more consistent accuracy from a reduced data set. Experimental results demonstrate that the selected patterns suffice to represent the class boundary and to preserve the decision surface.
Keywords: Pattern selection, data reduction, border pattern, edge pattern.
Yuhua Li, "Selecting Critical Patterns Based on Local Geometrical and Statistical Information," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 6, pp. 1189-1201, June 2011, doi:10.1109/TPAMI.2010.188
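The two selection criteria described in the abstract (edge patterns found via an approximated tangent hyperplane, border patterns found via local probability among neighbours) can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's algorithm: the function name, the thresholds `side_ratio` and `border_prob`, and the exact geometric tests are assumptions for the sketch.

```python
import numpy as np

def select_critical_patterns(X, y, k=5, side_ratio=0.8, border_prob=0.3):
    """Flag border and edge patterns. Illustrative sketch only."""
    n = len(X)
    keep = np.zeros(n, dtype=bool)
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]   # k nearest neighbours, self excluded
        # Border test: local probability of other classes among the neighbours.
        if np.mean(y[nbrs] != y[i]) >= border_prob:
            keep[i] = True
            continue
        # Edge test: approximate the tangent hyperplane's normal by the mean
        # of the vectors to same-class neighbours; if most neighbours fall on
        # one side of that hyperplane, the point lies on the class edge.
        same = nbrs[y[nbrs] == y[i]]
        if len(same) < 2:
            keep[i] = True                  # isolated point: keep conservatively
            continue
        V = X[same] - X[i]
        normal = V.mean(axis=0)
        if not np.any(normal):
            continue                        # neighbours balanced: interior point
        side = V @ normal
        keep[i] = np.mean(side > 0) >= side_ratio
    return keep

# Usage: two well-separated 5x5 grids. Grid corners are edge patterns;
# the grid centre is interior and is not selected.
grid = np.array([[i, j] for i in range(5) for j in range(5)], dtype=float)
X = np.vstack([grid, grid + 10.0])
y = np.array([0] * 25 + [1] * 25)
mask = select_critical_patterns(X, y)
```

On this toy data no border patterns arise (the classes are far apart), so selection is driven entirely by the edge test; with overlapping classes the local-probability test would fire first.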
[1] , 2009.
[2] ELENA/databases/REALphoneme/, 2009.
[3] S. Ajoka, S. Tsuge, M. Shishibori, and K. Kita, “Fast Multidimensional Nearest Neighbor Search Algorithm Using Priority Queue,” Electrical Eng. in Japan, vol. 164, no. 3, pp. 69-77, 2008.
[4] F. Angiulli, “Fast Nearest Neighbor Condensation for Large Data Sets Classification,” IEEE Trans. Knowledge and Data Eng., vol. 19, no. 11, pp. 1450-1464, Nov. 2007.
[5] F. Angiulli, “Condensed Nearest Neighbor Data Domain Description,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 10, pp. 1746-1758, Oct. 2007.
[6] R. Barandela, F.J. Ferri, and J.S. Sanchez, “Decision Boundary Preserving Prototype Selection for Nearest Neighbour Classification,” Int'l J. Pattern Recognition and Artificial Intelligence, vol. 19, no. 6, pp. 787-806, Sept. 2005.
[7] J.L. Bentley and J.H. Friedman, “Data-Structures for Range Searching,” ACM Computing Surveys, vol. 11, no. 4, pp. 397-409, 1979.
[8] J.C. Bezdek and L.I. Kuncheva, “Nearest Prototype Classifier Designs: An Experimental Study,” Int'l J. Intelligent Systems, vol. 16, no. 12, pp. 1445-1473, Dec. 2001.
[9] D.S. Broomhead and D. Lowe, “Multivariable Function Interpolation and Adaptive Networks,” Complex Systems, vol. 2, pp. 321-335, 1988.
[10] C.J.C. Burges, “A Tutorial on Support Vector Machines for Pattern Recognition,” Data Mining and Knowledge Discovery, vol. 2, pp. 121-167, 1998.
[11] C. Cachin, “Pedagogical Pattern Selection-Strategies,” Neural Networks, vol. 7, no. 1, pp. 175-181, 1994.
[12] G.C. Cawley and N.L.C. Talbot, “Efficient Leave-One-Out Cross-Validation of Kernel Fisher Discriminant Classifiers,” Pattern Recognition, vol. 36, pp. 2585-2592, 2003.
[13] V. Cerverón and F.J. Ferri, “Another Move Toward the Minimum Consistent Subset: A Tabu Search Approach to the Condensed Nearest Neighbor Rule,” IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 31, no. 3, pp. 408-413, June 2001.
[14] S.H. Cha and S.N. Srihari, “A Fast Nearest Neighbor Search Algorithm by Filtration,” Pattern Recognition, vol. 35, no. 2, pp. 515-525, Feb. 2002.
[15] C.C. Chang and C.J. Lin, LIBSVM: A Library for Support Vector Machines, 2001.
[16] D. Chaudhuri, C.A. Murthy, and B.B. Chaudhuri, “Finding a Subset of Representative Points in a Data Set,” IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 9, pp. 1416-1424, Sept. 1994.
[17] Y.X. Chen, X. Dang, H.X. Peng, and H.L. Bart, “Outlier Detection with the Kernelized Spatial Depth Function,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 2, pp. 288-305, Feb. 2009.
[18] S.H. Choi and P. Rockett, “The Training of Neural Classifiers with Condensed Data Sets,” IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 32, no. 2, pp. 202-206, Apr. 2002.
[19] B.V. Dasarathy, “Minimal Consistent Set (MCS) Identification for Optimal Nearest Neighbor Decision Systems Design,” IEEE Trans. Systems, Man, and Cybernetics, vol. 24, no. 3, pp. 511-517, Mar. 1994.
[20] J. Demsar, “Statistical Comparisons of Classifiers over Multiple Data Sets,” J. Machine Learning Research, vol. 7, pp. 1-30, Jan. 2006.
[21] G.M. Foody, “The Significance of Border Training Patterns in Classification by a Feedforward Neural Network Using Back Propagation Learning,” Int'l J. Remote Sensing, vol. 20, no. 18, pp. 3549-3562, Dec. 1999.
[22] J.H. Friedman, J.L. Bentley, and R.A. Finkel, “An Algorithm for Finding Best Matches in Logarithmic Expected Time,” ACM Trans. Math. Software, vol. 3, no. 3, pp. 209-226, Sept. 1977.
[23] K. Fukunaga, Introduction to Statistical Pattern Recognition, second ed. Academic Press, 1990.
[24] K. Fukunaga and P.M. Narendra, “A Branch and Bound Algorithm for Computing K-Nearest Neighbors,” IEEE Trans. Computers, vol. 24, no. 7, pp. 750-753, July 1975.
[25] K. Fukunaga and L.D. Hostetler, “Optimization of k Nearest-Neighbor Density Estimates,” IEEE Trans. Information Theory, vol. 19, no. 3, pp. 320-326, May 1973.
[26] A.K. Ghosh, “On Nearest Neighbor Classification Using Adaptive Choice of k,” J. Computational and Graphical Statistics, vol. 16, no. 2, pp. 482-502, June 2007.
[27] A.K. Ghosh, “On Optimum Choice of k in Nearest Neighbour Classification,” Computational Statistics & Data Analysis, vol. 50, pp. 3113-3123, 2006.
[28] G. Guo and J.S. Zhang, “Reducing Examples to Accelerate Support Vector Regression,” Pattern Recognition Letters, vol. 28, no. 16, pp. 2173-2183, 2007.
[29] P. Hart, “The Condensed Nearest Neighbor Rule,” IEEE Trans. Information Theory, vol. 14, no. 3, pp. 515-516, May 1968.
[30] T.K. Ho and M. Basu, “Complexity Measures of Supervised Classification Problems,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 289-300, Mar. 2002.
[31] S.W. Kim and B.J. Oommen, “A Brief Taxonomy and Ranking of Creative Prototype Reduction Schemes,” Pattern Analysis and Applications J., vol. 6, no. 3, pp. 232-244, 2004.
[32] S.W. Kim and B.J. Oommen, “Enhancing Prototype Reduction Schemes with LVQ3-Type Algorithms,” Pattern Recognition, vol. 36, no. 5, pp. 1083-1093, 2003.
[33] M.A. Kramer and J.A. Leonard, “Diagnosis Using Backpropagation Neural Networks—Analysis and Criticism,” Computers & Chemical Eng., vol. 14, no. 12, pp. 1323-1338, 1990.
[34] J. Li, M.T. Manry, C. Yu, and D.R. Wilson, “Prototype Classifier Design with Pruning,” Int'l J. Artificial Intelligence Tools, vol. 14, nos. 1/2, pp. 261-280, 2005.
[35] Y.H. Li, M.J. Pont, and N.B. Jones, “Improving the Performance of Radial Basis Function Classifiers in Condition Monitoring and Fault Diagnosis Applications Where ‘Unknown’ Faults May Occur,” Pattern Recognition Letters, vol. 23, no. 5, pp. 569-577, Mar. 2002.
[36] Y.H. Li, M.J. Pont, N.B. Jones, and J.A. Twiddle, “Applying MLP and RBF Classifiers in Embedded Condition Monitoring and Fault Diagnosis Systems,” Trans. Inst. of Measurement and Control, vol. 23, no. 5, pp. 315-343, 2001.
[37] T. Lin and H.B. Zha, “Riemannian Manifold Learning,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 30, no. 5, pp. 796-809, May 2008.
[38] M.T. Lozano, J.M. Sotoca, J.S. Sánchez, F. Pla, E. Pekalska, and R.P.W. Duin, “Experimental Study on Prototype Optimisation Algorithms for Prototype-Based Classification in Vector Spaces,” Pattern Recognition, vol. 39, no. 10, pp. 1827-1838, 2006.
[39] A. Lyhyaoui, M. Martinez, I. Mora, M. Vazquez, J.L. Sancho, and A.R. Figueiras-Vidal, “Sample Selection via Clustering to Construct Support Vector-Like Classifiers,” IEEE Trans. Neural Networks, vol. 10, no. 6, pp. 1474-1481, Nov. 1999.
[40] E. Marchiori, “Hit Miss Networks with Applications to Instance Selection,” J. Machine Learning Research, vol. 9, pp. 997-1017, 2008.
[41] E. Marchiori, “Class Conditional Nearest Neighbor for Large Margin Instance Selection,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 32, no. 2, pp. 364-370, Feb. 2010.
[42] M. Markou and S. Singh, “A Neural Network-Based Novelty Detector for Image Sequence Analysis,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1664-1677, Oct. 2006.
[43] K.G. Mehrotra, C.K. Mohan, and S. Ranka, “Bounds on the Number of Samples Needed for Neural Learning,” IEEE Trans. Neural Networks, vol. 2, no. 6, pp. 548-558, Nov. 1991.
[44] N. Mekuz and J.K. Tsotsos, “Parameterless Isomap with Adaptive Neighborhood Selection,” Proc. DAGM Symp., K. Franke, et al., eds., pp. 364-373, 2006.
[45] D.J. Newman, S. Hettich, C.L. Blake, and C.J. Merz, UCI Repository of Machine Learning Databases, Dept. of Information and Computer Science, Univ. of California, 1998.
[46] R. Paredes and E. Vidal, “Learning Prototypes and Distances: A Prototype Reduction Technique Based on Nearest Neighbor Error Minimization,” Pattern Recognition, vol. 39, no. 2, pp. 180-188, 2006.
[47] E. Pekalska, R.P.W. Duin, and P. Paclik, “Prototype Selection for Dissimilarity-Based Classifiers,” Pattern Recognition, vol. 39, no. 2, pp. 189-208, Feb. 2006.
[48] R.T. Peres and C.E. Pedreira, “Generalized Risk Zone: Selecting Observations for Classification,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 7, pp. 1331-1337, July 2009.
[49] D. Prokhorov, “IJCNN 2001 Neural Network Competition,” Slide presentation at Int'l Joint Conf. Neural Networks, 2001.
[50] T. Raicharoen and C. Lursinsap, “A Divide-and-Conquer Approach to the Pairwise Opposite Class-Nearest Neighbor (POC-NN) Algorithm,” Pattern Recognition Letters, vol. 26, no. 10, pp. 1554-1567, July 2005.
[51] G. Rätsch, T. Onoda, and K.-R. Müller, “Soft Margins for AdaBoost,” Machine Learning, vol. 42, no. 2, pp. 287-320, 2001.
[52] D.E. Rumelhart, G.E. Hinton, and R.J. Williams, “Learning Representations by Back-Propagating Errors,” Nature, vol. 323, pp. 533-536, Oct. 1986.
[53] H. Samet, “K-Nearest Neighbor Finding Using MaxNearestDist,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 243-252, Feb. 2008.
[54] O. Samko, A.D. Marshall, and P.L. Rosin, “Selection of the Optimal Parameter Value for the Isomap Algorithm,” Pattern Recognition Letters, vol. 27, no. 9, pp. 968-979, June 2006.
[55] J.S. Sánchez and A.I. Marqués, “An LVQ-Based Adaptive Algorithm for Learning from Very Small Codebooks,” Neurocomputing, vol. 69, nos. 7-9, pp. 922-927, 2006.
[56] J.S. Sánchez, “High Training Set Size Reduction by Space Partitioning and Prototype Abstraction,” Pattern Recognition, vol. 37, no. 7, pp. 1561-1564, 2004.
[57] J.S. Sánchez, F. Pla, and F.J. Ferri, “Prototype Selection for the Nearest Neighbour Rule through Proximity Graphs,” Pattern Recognition Letters, vol. 18, no. 6, pp. 507-513, 1997.
[58] J.L. Sancho, W.E. Pierson, B. Ulug, A.R. Figueiras-Vidal, and S.C. Ahalt, “Class Separability Estimation and Incremental Learning Using Boundary Methods,” Neurocomputing, vol. 35, pp. 3-26, Nov. 2000.
[59] L.K. Saul and S.T. Roweis, “Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds,” J. Machine Learning Research, vol. 4, no. 2, pp. 119-155, Feb. 2004.
[60] B. Schölkopf, R.C. Williamson, A.J. Smola, J. Shawe-Taylor, and J.C. Platt, “Support Vector Method for Novelty Detection,” Advances in Neural Information Processing Systems, vol. 12, pp. 582-588, MIT Press, 2000.
[61] H. Shin and S. Cho, “Fast Pattern Selection for Support Vector Classifiers,” Proc. Conf. Advances in Knowledge Discovery and Data Mining, vol. 2637, pp. 376-387, 2003.
[62] H. Shin and S. Cho, “Neighborhood Property-Based Pattern Selection for Support Vector Machines,” Neural Computation, vol. 19, no. 3, pp. 816-855, Mar. 2007.
[63] T. Tambouratzis, “Counter-Clustering for Training Pattern Selection,” Computer J., vol. 43, no. 3, pp. 177-190, 2000.
[64] D.M.J. Tax and R.P.W. Duin, “Support Vector Data Description,” Machine Learning, vol. 54, pp. 45-66, 2004.
[65] J.B. Tenenbaum, V. de Silva, and J.C. Langford, “A Global Geometric Framework for Nonlinear Dimensionality Reduction,” Science, vol. 290, no. 5500, pp. 2319-2323, Dec. 2000.
[66] V.N. Vapnik, The Nature of Statistical Learning Theory. Springer-Verlag, 1995.
[67] K.R. Varshney and A.S. Willsky, “Classification Using Geometric Level Sets,” J. Machine Learning Research, vol. 11, pp. 491-516, Feb. 2010.
[68] C.J. Veenman and M.J.T. Reinders, “The Nearest Subclass Classifier: A Compromise between the Nearest Mean and Nearest Neighbor Classifier,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1417-1429, Sept. 2005.
[69] G.H. Wen, L.J. Jiang, and J. Wen, “Using Locally Estimated Geodesic Distance to Optimize Neighborhood Graph for Isometric Data Embedding,” Pattern Recognition, vol. 41, no. 7, pp. 2226-2236, July 2008.
[70] D.R. Wilson and T.R. Martinez, “Reduction Techniques for Instance-Based Learning Algorithms,” Machine Learning, vol. 38, no. 3, pp. 257-286, Mar. 2000.
[71] H. Zhang and G. Sun, “Optimal Reference Subset Selection for Nearest Neighbor Classification by Tabu Search,” Pattern Recognition, vol. 35, no. 7, pp. 1481-1490, 2002.