Issue No. 08 - August 2011 (vol. 33)
pp. 1659-1672
Raúl Fidalgo-Merino, University of Málaga, Málaga
Marlon Núñez, University of Málaga, Málaga
A new algorithm for the incremental construction of binary regression trees is presented. The algorithm, called SAIRT, adapts the induced model when facing data streams with unknown dynamics, such as gradual and abrupt function drift, changes in certain regions of the function, noise, and virtual drift; it handles both symbolic and numeric attributes. It automatically adapts its internal parameters and model structure to capture new patterns according to the current dynamics of the data stream. SAIRT monitors the usefulness of nodes and can forget examples from selected regions, storing the remaining ones in local windows associated with the leaves of the tree. Under these conditions, current regression methods require careful configuration tailored to the dynamics of the problem. Experimentation suggests that the proposed algorithm obtains better results than current algorithms on data streams involving changes at different speeds, noise levels, sampling distributions of examples, and partial or complete changes of the underlying function.
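The core mechanism the abstract describes, a binary regression tree whose leaves keep bounded local windows of recent examples so that older examples are forgotten as the stream drifts, can be illustrated with a minimal sketch. This is not SAIRT itself: the paper's drift monitoring, node-usefulness tests, and self-adapting parameters are not reproduced here, and the class names, thresholds, and the variance-reduction split test below are illustrative assumptions only.

```python
from collections import deque

class Leaf:
    """A leaf holding a bounded local window of recent (x, y) examples.
    The deque's maxlen forgets the oldest examples automatically."""
    def __init__(self, capacity=50):
        self.window = deque(maxlen=capacity)

    def add(self, x, y):
        self.window.append((x, y))

    def predict(self):
        ys = [y for _, y in self.window]
        return sum(ys) / len(ys) if ys else 0.0

class Split:
    """An internal binary test node: x[attr] <= threshold goes left."""
    def __init__(self, attr, threshold, left, right):
        self.attr, self.threshold = attr, threshold
        self.left, self.right = left, right

def variance(ys):
    if len(ys) < 2:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

class IncrementalRegressionTree:
    """Illustrative incremental learner: each arriving example is routed
    to a leaf, stored in its local window, and the leaf is split when a
    candidate test sufficiently reduces the target variance."""
    def __init__(self, capacity=50, min_gain=0.1, min_samples=20):
        self.root = Leaf(capacity)
        self.capacity, self.min_gain, self.min_samples = capacity, min_gain, min_samples

    def _leaf(self, node, x):
        """Walk down to the leaf responsible for x; return (leaf, parent, side)."""
        parent, side = None, None
        while isinstance(node, Split):
            parent = node
            side = 'left' if x[node.attr] <= node.threshold else 'right'
            node = getattr(node, side)
        return node, parent, side

    def learn(self, x, y):
        leaf, parent, side = self._leaf(self.root, x)
        leaf.add(x, y)
        split = self._try_split(leaf)
        if split is not None:
            if parent is None:
                self.root = split
            else:
                setattr(parent, side, split)

    def _try_split(self, leaf):
        data = list(leaf.window)
        if len(data) < self.min_samples:
            return None
        base = variance([y for _, y in data])
        best = None
        for attr in range(len(data[0][0])):
            for threshold in sorted({x[attr] for x, _ in data}):
                lo = [y for x, y in data if x[attr] <= threshold]
                hi = [y for x, y in data if x[attr] > threshold]
                if not lo or not hi:
                    continue
                gain = base - (len(lo) * variance(lo) + len(hi) * variance(hi)) / len(data)
                if best is None or gain > best[0]:
                    best = (gain, attr, threshold)
        if best is None or best[0] <= self.min_gain * base:
            return None
        _, attr, threshold = best
        left, right = Leaf(self.capacity), Leaf(self.capacity)
        for x, y in data:
            (left if x[attr] <= threshold else right).add(x, y)
        return Split(attr, threshold, left, right)

    def predict(self, x):
        leaf, _, _ = self._leaf(self.root, x)
        return leaf.predict()
```

The bounded deques give a simple form of forgetting: when the underlying function drifts, each leaf's prediction tracks the mean of only its most recent local examples. SAIRT's selective, region-dependent forgetting and automatic window sizing are more sophisticated than this fixed-capacity stand-in.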
Index Terms: Machine learning, mining methods and algorithms, knowledge acquisition, heuristics design.
Raúl Fidalgo-Merino, Marlon Núñez, "Self-Adaptive Induction of Regression Trees," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 8, pp. 1659-1672, August 2011, doi:10.1109/TPAMI.2011.19