A Novel Feature Selection Methodology for Automated Inspection Systems
July 2009 (vol. 31 no. 7)
pp. 1338-1344
Hugo C. Garcia, L3, Electro-Optical Systems, Tempe
Jesus Rene Villalobos, Arizona State University, Tempe
Rong Pan, Arizona State University, Tempe
George C. Runger, Arizona State University, Tempe
This paper proposes a new feature selection methodology. The methodology builds on the stepwise variable selection procedure but, instead of traditional discriminant metrics such as Wilks' lambda, uses an estimate of the misclassification error as the figure of merit for evaluating the introduction of new features. The expected misclassification error rate (MER) is obtained from the densities of a constructed function of random variables, which is the stochastic representation of the conditional distribution of the quadratic discriminant function estimate. The proposed methodology yields substantial savings in computational time for estimating the classification error compared with traditional simulation and cross-validation methods. A key advantage of the proposed method is that it provides a direct estimate of the expected misclassification error at the time of feature selection, giving an immediate assessment of the benefit of introducing an additional feature into an inspection/classification algorithm.
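The overall procedure the abstract describes — forward stepwise selection where each candidate feature is scored by the estimated misclassification error of a plug-in quadratic discriminant function — can be sketched as follows. This is an illustrative sketch, not the authors' method: the paper derives the expected MER analytically from the distribution of the quadratic discriminant function estimate, whereas this sketch substitutes a simple holdout error estimate, and all function names are hypothetical.

```python
import numpy as np

def qdf_fit(X, y):
    """Plug-in quadratic discriminant: per-class mean, covariance, and prior."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # Small ridge on the covariance keeps the plug-in estimate invertible.
        cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        params[c] = (Xc.mean(axis=0), cov, len(Xc) / len(X))
    return params

def qdf_predict(params, X):
    """Assign each row to the class maximizing the quadratic discriminant score."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, S, prior = params[c]
        d = X - mu
        Sinv = np.linalg.inv(S)
        # Log-density up to a constant: -1/2 log|S| - 1/2 (x-mu)' S^{-1} (x-mu) + log(prior)
        s = (-0.5 * np.linalg.slogdet(S)[1]
             - 0.5 * np.einsum('ij,jk,ik->i', d, Sinv, d)
             + np.log(prior))
        scores.append(s)
    return np.array(classes)[np.argmax(scores, axis=0)]

def estimated_mer(Xtr, ytr, Xte, yte):
    """Holdout stand-in for the paper's analytic expected-MER estimate."""
    return float(np.mean(qdf_predict(qdf_fit(Xtr, ytr), Xte) != yte))

def stepwise_select(Xtr, ytr, Xte, yte, max_feats=3):
    """Forward stepwise selection: admit the feature that most lowers the
    estimated MER; stop when no remaining feature improves it."""
    selected, best_err = [], 1.0
    while len(selected) < max_feats:
        cand = [(estimated_mer(Xtr[:, selected + [j]], ytr,
                               Xte[:, selected + [j]], yte), j)
                for j in range(Xtr.shape[1]) if j not in selected]
        err, j = min(cand)
        if err >= best_err:  # entry criterion: stop if no error reduction
            break
        selected.append(j)
        best_err = err
    return selected, best_err
```

The stopping rule mirrors the stepwise entry criterion: a feature enters only while it reduces the error estimate, so the benefit of each additional feature is assessed directly in MER terms at selection time, as the abstract emphasizes.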


Index Terms:
Feature selection, misclassification error rate, quadratic discriminant function.
Citation:
Hugo C. Garcia, Jesus Rene Villalobos, Rong Pan, George C. Runger, "A Novel Feature Selection Methodology for Automated Inspection Systems," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 7, pp. 1338-1344, July 2009, doi:10.1109/TPAMI.2008.276