Dimensionality Reduction in Automatic Knowledge Acquisition: A Simple Greedy Search Approach
November/December 2003 (vol. 15, no. 6)
pp. 1364-1373

Abstract—Knowledge acquisition is the process of collecting domain knowledge, documenting it, and transforming it into a computerized representation. Because of the difficulty of eliciting knowledge from human experts, knowledge acquisition has long been identified as a bottleneck in the development of knowledge-based systems. Over the past decades, a number of automatic knowledge acquisition techniques have been developed. However, the performance of these techniques suffers from the so-called curse of dimensionality, i.e., it degrades when many irrelevant (or redundant) parameters are present. This paper presents a heuristic approach, based on statistics and greedy search, for dimensionality reduction to facilitate automatic knowledge acquisition. The approach deals with classification problems. Specifically, chi-square statistics are used to rank the importance of individual parameters. A backward search procedure is then employed to eliminate parameters (less important parameters first) that do not contribute to class separability. The algorithm is very efficient and was found to be effective when applied to a variety of problems with different characteristics.
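The abstract describes a two-stage scheme: rank parameters by a chi-square test of independence against the class label, then run a greedy backward search that drops the least important parameters whenever doing so does not hurt class separability. The following Python sketch illustrates that general idea only; the helper names (chi2_rank, backward_eliminate) and the use of cross-validated decision-tree accuracy as the separability test are assumptions for illustration, not the paper's actual criterion or implementation.

```python
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier


def chi2_rank(X: pd.DataFrame, y: pd.Series) -> list:
    """Rank discrete-valued features by chi-square statistic, most important first."""
    scores = {}
    for col in X.columns:
        # Contingency table: feature value x class label.
        table = pd.crosstab(X[col], y)
        stat, _, _, _ = chi2_contingency(table)
        scores[col] = stat
    return sorted(scores, key=scores.get, reverse=True)


def backward_eliminate(X: pd.DataFrame, y: pd.Series, tol: float = 0.0) -> list:
    """Greedy backward search: try removing the least important features first,
    keeping a removal only if class separability does not degrade."""

    def separability(cols: list) -> float:
        # Stand-in separability measure (an assumption, not the paper's):
        # 5-fold cross-validated accuracy of a decision tree. Features are
        # assumed to be integer-encoded categorical/discretized values.
        clf = DecisionTreeClassifier(random_state=0)
        return cross_val_score(clf, X[cols], y, cv=5).mean()

    selected = chi2_rank(X, y)              # most important first
    baseline = separability(selected)       # separability with all features
    for col in reversed(list(selected)):    # consider least important first
        candidate = [c for c in selected if c != col]
        if candidate and separability(candidate) >= baseline - tol:
            selected = candidate            # feature was irrelevant or redundant
    return selected
```

On a suitably encoded data set, backward_eliminate(X, y) would return the names of the surviving parameters, ordered most important first; the tol argument loosens the acceptance test to allow small drops in accuracy.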


Index Terms:
Curse of dimensionality, feature selection, Chi-square test of independence, greedy search, classification.
Citation:
Samuel H. Huang, "Dimensionality Reduction in Automatic Knowledge Acquisition: A Simple Greedy Search Approach," IEEE Transactions on Knowledge and Data Engineering, vol. 15, no. 6, pp. 1364-1373, Nov.-Dec. 2003, doi:10.1109/TKDE.2003.1245278