Top-Down Induction of Model Trees with Regression and Splitting Nodes
May 2004 (vol. 26 no. 5)
pp. 612-625

Abstract—Model trees are an extension of regression trees that associate leaves with multiple regression models. In this paper, a method for the data-driven construction of model trees is presented, namely, the Stepwise Model Tree Induction (SMOTI) method. Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and splitting nodes, which partition the feature space. The multiple linear model associated with each leaf is then built stepwise by combining straight-line regressions reported along the path from the root to the leaf. In this way, internal regression nodes contribute to the definition of multiple models and have a "global" effect, while straight-line regressions at leaves have only "local" effects. Experimental results on artificially generated data sets show that SMOTI outperforms two model tree induction systems, M5' and RETIS, in accuracy. Results on benchmark data sets used for studies on both regression and model trees show that SMOTI performs better than RETIS in accuracy, while it is not possible to draw statistically significant conclusions on the comparison with M5'. Model trees induced by SMOTI are generally simple and easily interpretable, and their analysis often reveals interesting patterns.
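To make the two node types concrete, the Python sketch below, written from the abstract alone, grows a toy tree in which a regression node fits one straight line and passes the residuals down to its subtree (the "global" effect shared by every leaf below it), while a splitting node only partitions the feature space; the multiple model at a leaf is then the sum of the straight lines met along its path plus a local constant. The greedy sum-of-squared-errors criterion, the median split candidates, and the depth/size stopping rules are illustrative assumptions, not the published SMOTI procedure.

# Minimal sketch of a model tree with SMOTI-style regression and splitting
# nodes, reconstructed from the abstract alone. The selection criterion and
# stopping rules are illustrative assumptions, not the authors' algorithm.
import numpy as np

def fit_line(x, y):
    """Least-squares straight line y ~ a + b*x (np.polyfit returns [slope, intercept])."""
    b, a = np.polyfit(x, y, 1)
    return a, b

class Leaf:
    def __init__(self, y):
        self.value = float(np.mean(y))  # local constant completing the leaf model
    def predict(self, X):
        return np.full(len(X), self.value)

class RegressionNode:
    """Regression node: one straight-line regression whose effect is 'global',
    i.e. shared by every leaf of the subtree, which models the residuals."""
    def __init__(self, X, y, var, child):
        self.var = var
        self.a, self.b = fit_line(X[:, var], y)
        self.child = child(X, y - (self.a + self.b * X[:, var]))
    def predict(self, X):
        return self.a + self.b * X[:, self.var] + self.child.predict(X)

class SplitNode:
    """Splitting node: partitions the feature space; no regression performed."""
    def __init__(self, X, y, var, thr, child):
        self.var, self.thr = var, thr
        mask = X[:, var] <= thr
        self.left, self.right = child(X[mask], y[mask]), child(X[~mask], y[~mask])
    def predict(self, X):
        mask = X[:, self.var] <= self.thr
        out = np.empty(len(X))
        out[mask] = self.left.predict(X[mask])
        out[~mask] = self.right.predict(X[~mask])
        return out

def grow(X, y, depth=0, max_depth=4, min_samples=10):
    """Top-down induction: at each node pick the regression-node or
    split-node candidate that most reduces the residual sum of squares."""
    if depth >= max_depth or len(y) < min_samples or np.var(y) == 0.0:
        return Leaf(y)
    child = lambda Xs, ys: grow(Xs, ys, depth + 1, max_depth, min_samples)
    choice, best_sse = None, np.sum((y - y.mean()) ** 2)  # baseline: plain leaf
    for var in range(X.shape[1]):
        a, b = fit_line(X[:, var], y)               # regression-node candidate
        sse = np.sum((y - a - b * X[:, var]) ** 2)
        if sse < best_sse:
            choice, best_sse = ("reg", var, None), sse
        thr = np.median(X[:, var])                  # split-node candidate
        mask = X[:, var] <= thr
        if 0 < mask.sum() < len(y):
            sse = sum(np.sum((y[m] - y[m].mean()) ** 2) for m in (mask, ~mask))
            if sse < best_sse:
                choice, best_sse = ("split", var, thr), sse
    if choice is None:
        return Leaf(y)
    kind, var, thr = choice
    if kind == "reg":
        return RegressionNode(X, y, var, child)
    return SplitNode(X, y, var, thr, child)

# Toy demo: y depends linearly on x0 everywhere (a 'global' effect) and has a
# piecewise shift in x1, so a good tree combines a split with regression nodes.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 2))
y = 3.0 * X[:, 0] + np.where(X[:, 1] > 0.0, 2.0, -2.0) + rng.normal(0.0, 0.1, 400)
tree = grow(X, y)
print(np.mean((tree.predict(X) - y) ** 2))  # small training MSE

Note that in this toy version a regression node only subtracts its straight line from the response before recursing; the full method described in the paper is more involved, but the sketch preserves the key contrast between global regression nodes and local leaf models.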


Index Terms:
Inductive learning, linear regression, model trees, global and local effects, regression and splitting nodes, SMOTI.
Citation:
Donato Malerba, Floriana Esposito, Michelangelo Ceci, Annalisa Appice, "Top-Down Induction of Model Trees with Regression and Splitting Nodes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 5, pp. 612-625, May 2004, doi:10.1109/TPAMI.2004.1273937