Theoretical and Practical Considerations of Uncertainty and Complexity in Automated Knowledge Acquisition
October 1995 (vol. 7 no. 5)
pp. 699-712

Abstract—Inductive machine learning has become an important approach to automated knowledge acquisition from databases. The disjunctive normal form (DNF), as the common analytic representation of decision trees and decision tables (rules), provides a basis for formal analysis of uncertainty and complexity in inductive learning. In this paper, a theory of general decision trees is developed based on Shannon's expansion of the discrete DNF, and a probabilistic induction system, PIK, is then developed for extracting knowledge from real-world data. We then combine formal and practical approaches to study how data characteristics affect the uncertainty and complexity of inductive learning. Three important data characteristics are studied: disjunctiveness, noise, and incompleteness. The combination of leveled-pruning, leveled-condensing, and resampling-estimation turns out to be a very powerful method for dealing with highly disjunctive and inadequate data. Finally, the PIK system is compared with other recent inductive learning systems on a number of real-world domains.
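For context, the Shannon expansion referred to in the abstract is, in its standard Boolean form (a textbook identity, not the paper's own notation), the decomposition of a switching function f about a chosen variable x_i:

    f(x_1, ..., x_n) = x_i · f(x_1, ..., x_i = 1, ..., x_n)  ∨  x_i' · f(x_1, ..., x_i = 0, ..., x_n)

Applying this expansion recursively over the attributes yields a decision tree, and reading each path to a positive leaf as a conjunction of attribute tests gives the equivalent DNF expression of the learned concept, which is the link between trees, rules, and DNF that the formal analysis builds on.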

Index Terms:
Complexity, decision trees, disjunctive normal forms, knowledge acquisition, machine learning, uncertainty
Citation:
Xiao-Jia M. Zhou, Tharam S. Dillon, "Theoretical and Practical Considerations of Uncertainty and Complexity in Automated Knowledge Acquisition," IEEE Transactions on Knowledge and Data Engineering, vol. 7, no. 5, pp. 699-712, Oct. 1995, doi:10.1109/69.469826