
Bibliographic References  
 
ASCII Text:
David Martens, Bart Baesens, Tony Van Gestel, "Decompositional Rule Extraction from Support Vector Machines by Active Learning," IEEE Transactions on Knowledge and Data Engineering, vol. 21, no. 2, pp. 178-191, February 2009.
BibTeX:
@article{10.1109/TKDE.2008.131,
  author    = {David Martens and Bart Baesens and Tony Van Gestel},
  title     = {Decompositional Rule Extraction from Support Vector Machines by Active Learning},
  journal   = {IEEE Transactions on Knowledge and Data Engineering},
  volume    = {21},
  number    = {2},
  issn      = {1041-4347},
  year      = {2009},
  pages     = {178-191},
  doi       = {http://doi.ieeecomputersociety.org/10.1109/TKDE.2008.131},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA}
}
RefWorks / ProCite / RefMan / EndNote (RIS):
TY  - JOUR
JO  - IEEE Transactions on Knowledge and Data Engineering
TI  - Decompositional Rule Extraction from Support Vector Machines by Active Learning
IS  - 2
SN  - 1041-4347
SP  - 178
EP  - 191
A1  - David Martens
A1  - Bart Baesens
A1  - Tony Van Gestel
PY  - 2009
KW  - Support vector machine
KW  - rule extraction
KW  - active learning
KW  - black box models
KW  - ALBA
VL  - 21
JA  - IEEE Transactions on Knowledge and Data Engineering
ER  - 
[1] B. Baesens, T. Van Gestel, S. Viaene, M. Stepanova, J. Suykens, and J. Vanthienen, “Benchmarking State-of-the-Art Classification Algorithms for Credit Scoring,” J. Operational Research Soc., vol. 54, no. 6, pp. 627-635, 2003.
[2] M. Pazzani, S. Mani, and W. Shankle, “Acceptance by Medical Experts of Rules Generated by Machine Learning,” Methods of Information in Medicine, vol. 40, no. 5, pp. 380-385, 2001.
[3] D. Martens, L. Bruynseels, B. Baesens, M. Willekens, and J. Vanthienen, “Predicting Going Concern Opinion with Data Mining,” Decision Support Systems, vol. 45, pp. 765-777, 2008.
[4] R. Andrews, J. Diederich, and A. Tickle, “Survey and Critique of Techniques for Extracting Rules from Trained Artificial Neural Networks,” Knowledge-Based Systems, vol. 8, no. 6, pp. 373-389, 1995.
[5] B. Baesens, R. Setiono, C. Mues, and J. Vanthienen, “Using Neural Network Rule Extraction and Decision Tables for Credit-Risk Evaluation,” Management Science, vol. 49, no. 3, pp. 312-329, 2003.
[6] M. Craven and J. Shavlik, “Extracting Tree-Structured Representations of Trained Networks,” Advances in Neural Information Processing Systems, vol. 8, D. Touretzky, M. Mozer, and M. Hasselmo, eds., pp. 24-30, The MIT Press, citeseer.ist.psu.edu/craven96extracting.html, 1996.
[7] M. Craven, “Extracting Comprehensible Models from Trained Neural Networks,” PhD dissertation, Dept. of Computer Sciences, Univ. of Wisconsin-Madison, 1996.
[8] V.N. Vapnik, The Nature of Statistical Learning Theory. Springer-Verlag Inc., 1995.
[9] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge Univ. Press, 2000.
[10] U. Johansson, R. König, and L. Niklasson, “The Truth Is in There—Rule Extraction from Opaque Models Using Genetic Programming,” Proc. 17th Int'l Florida AI Research Symp. Conf. (FLAIRS), 2004.
[11] D. Martens, B. Baesens, T. Van Gestel, and J. Vanthienen, “Comprehensible Credit Scoring Models Using Rule Extraction from Support Vector Machines,” European J. Operational Research, vol. 183, no. 3, pp. 1466-1476, 2007.
[12] B.D. Ripley, “Neural Networks and Related Methods for Classification,” J. Royal Statistical Soc. B, vol. 56, pp. 409-456, 1994.
[13] J. Huysmans, B. Baesens, and J. Vanthienen, “Using Rule Extraction to Improve the Comprehensibility of Predictive Models,” K.U.Leuven KBI, Research 0612, 2006.
[14] T. Van Gestel, J. Suykens, B. Baesens, S. Viaene, J. Vanthienen, G. Dedene, B. De Moor, and J. Vandewalle, “Benchmarking Least Squares Support Vector Machine Classifiers,” Machine Learning, vol. 54, no. 1, pp. 5-32, 2004.
[15] C. Bishop, Neural Networks for Pattern Recognition. Oxford Univ. Press, 1996.
[16] J. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor, and J. Vandewalle, Least Squares Support Vector Machines. World Scientific, 2002.
[17] N. Barakat and J. Diederich, “Eclectic Rule-Extraction from Support Vector Machines,” Int'l J. Computational Intelligence, vol. 2, no. 1, pp. 59-62, 2005.
[18] D. Martens, J. Huysmans, R. Setiono, J. Vanthienen, and B. Baesens, “Rule Extraction from Support Vector Machines: An Overview of Issues and Application in Credit Scoring,” Rule Extraction from Support Vector Machines, ser. Studies in Computational Intelligence, vol. 80, chapter 2, pp. 33-63, Springer, 2008.
[19] H. Núñez, C. Angulo, and A. Català, “Rule Extraction from Support Vector Machines,” Proc. European Symp. Artificial Neural Networks (ESANN '02), pp. 107-112, 2002.
[20] G. Fung, S. Sandilya, and R. Rao, “Rule Extraction from Linear Support Vector Machines,” Proc. 11th ACM SIGKDD Int'l Conf. Knowledge Discovery in Data Mining (KDD '05), pp. 32-40, 2005.
[21] J. Huysmans, B. Baesens, and J. Vanthienen, “ITER: An Algorithm for Predictive Regression Rule Extraction,” Proc. Eighth Int'l Conf. Data Warehousing and Knowledge Discovery (DaWaK '06), vol. 4081, pp. 270-279, Springer-Verlag, 2006.
[22] N. Barakat and A. Bradley, “Rule Extraction from Support Vector Machines: A Sequential Covering Approach,” IEEE Trans. Knowledge and Data Eng., vol. 19, no. 6, pp. 729-741, June 2007.
[23] L. Breiman, J. Friedman, R. Olshen, and C. Stone, Classification and Regression Trees. Wadsworth and Brooks, 1984.
[24] P. Clark and T. Niblett, “The CN2 Induction Algorithm,” Machine Learning, vol. 3, no. 4, pp. 261-283, 1989.
[25] J. Quinlan, C4.5: Programs for Machine Learning. Morgan Kaufmann, 1993.
[26] I. Taha and J. Ghosh, “Symbolic Interpretation of Artificial Neural Networks,” IEEE Trans. Knowledge and Data Eng., vol. 11, no. 3, pp. 448-463, May/June 1999.
[27] G. Schmitz, C. Aldrich, and F. Gouws, “ANN-DT: An Algorithm for Extraction of Decision Trees from Artificial Neural Networks,” IEEE Trans. Neural Networks, vol. 10, no. 6, pp. 1392-1401, 1999.
[28] O. Boz, “Converting a Trained Neural Network to a Decision Tree. Dectext—Decision Tree Extractor,” PhD dissertation, Dept. of Computer Science and Eng., Lehigh Univ., citeseer.ist.psu.edu/boz00converting.html, 2000.
[29] Z.-H. Zhou, Y. Jiang, and S.-F. Chen, “Extracting Symbolic Rules from Trained Neural Network Ensembles,” AI Comm., vol. 16, no. 1, pp. 3-15, 2003.
[30] U. Johansson, R. König, and L. Niklasson, “Rule Extraction from Trained Neural Networks Using Genetic Programming,” Proc. Joint 13th Int'l Conf. Artificial Neural Networks and 10th Int'l Conf. Neural Information Processing (ICANN/ICONIP '03), pp. 13-16, 2003.
[31] U. Markowska-Kaczmar and W. Trelak, “Extraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm,” Proc. European Symp. Artificial Neural Networks (ESANN '03), pp. 149-154, 2003.
[32] U. Markowska-Kaczmar and M. Chumieja, “Discovering the Mysteries of Neural Networks,” Int'l J. Hybrid Intelligent Systems, vol. 1, nos. 3-4, pp. 153-163, 2004.
[33] J. Rabuñal, J. Dorado, A. Pazos, J. Pereira, and D. Rivero, “A New Approach to the Extraction of ANN Rules and to Their Generalization Capacity through GP,” Neural Computation, vol. 16, no. 7, pp. 1483-1523, 2004.
[34] F. Chen, “Learning Accurate and Understandable Rules from SVM Classifiers,” master's thesis, Simon Fraser Univ., 2004.
[35] R. Setiono, B. Baesens, and C. Mues, “Risk Management and Regulatory Compliance: A Data Mining Framework Based on Neural Network Rule Extraction,” Proc. Int'l Conf. Information Systems (ICIS), 2006.
[36] D. Martens, M. De Backer, R. Haesen, M. Snoeck, J. Vanthienen, and B. Baesens, “Classification with Ant Colony Optimization,” IEEE Trans. Evolutionary Computation, vol. 11, no. 5, pp. 651-665, 2007.
[37] J.R. Quinlan, C4.5: Programs for Machine Learning. Morgan Kaufmann, 1993.
[38] W.W. Cohen, “Fast Effective Rule Induction,” Proc. 12th Int'l Conf. Machine Learning (ICML '95), A. Prieditis and S. Russell, eds., pp. 115-123, 1995.
[39] P.-N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining. Addison-Wesley, 2005.
[40] I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann, 2000.
[41] N. Barakat and J. Diederich, “Learning-Based Rule-Extraction from Support Vector Machines,” Proc. 14th Int'l Conf. Computer Theory and Applications (ICCTA), 2004.
[42] M. Mannino and M. Koushik, “The Cost-Minimizing Inverse Classification Problem: A Genetic Algorithm Approach,” Decision Support Systems, vol. 29, pp. 283-300, 2000.
[43] T. Van Gestel, B. Baesens, P.V. Dijcke, J. Suykens, J. Garcia, and T. Alderweireld, “Linear and Non-Linear Credit Scoring by Combining Logistic Regression and Support Vector Machines,” J. Credit Risk, vol. 1, no. 4, 2006.
[44] T. Van Gestel, D. Martens, B. Baesens, D. Feremans, J. Huysmans, and J. Vanthienen, “Forecasting and Analyzing Insurance Companies' Ratings,” Int'l J. Forecasting, vol. 23, no. 3, pp. 513-529, 2007.
[45] D. Cohn, L. Atlas, and R. Ladner, “Improving Generalization with Active Learning,” Machine Learning, vol. 15, no. 2, pp. 201-221, 1994.
[46] T. Downs, K. Gates, and A. Masters, “Exact Simplification of Support Vector Solutions,” J. Machine Learning Research, vol. 2, pp. 293-297, 2001.
[47] M. Tipping, “Sparse Bayesian Learning and the Relevance Vector Machine,” J. Machine Learning Research, vol. 1, pp. 211-244, citeseer.ist.psu.edu/tipping01sparse.html, 2001.
[48] S. Hettich and S.D. Bay, The UCI KDD Archive, Dept. of Information and Computer Science, Univ. of California, http://kdd.ics.uci.edu, 1996.
[49] C.-W. Hsu and C.-J. Lin, “A Comparison of Methods for Multi-Class Support Vector Machines,” IEEE Trans. Neural Networks, vol. 13, pp. 415-425, 2002.
[50] T.G. Dietterich, “Approximate Statistical Test for Comparing Supervised Classification Learning Algorithms,” Neural Computation, vol. 10, no. 7, pp. 1895-1923, 1998.
[51] N. Barakat and A. Bradley, “Rule Extraction from Support Vector Machines: Measuring the Explanation Capability Using the Area Under the ROC Curve,” Proc. 18th Int'l Conf. Pattern Recognition (ICPR '06), vol. 2, pp. 812-815, 2006.
[52] T. Fawcett, “PRIE: A System for Generating Rulelists to Maximize ROC Performance,” Data Mining and Knowledge Discovery, vol. 17, no. 2, pp. 207-224, 2008.
[53] M. Saar-Tsechansky and F. Provost, “Decision-Centric Active Learning of Binary-Outcome Models,” Information Systems Research, vol. 18, no. 1, pp. 4-22, 2007.
[54] Credit Scoring and Its Applications, L. Thomas, D. Edelman, and J. Crook, eds. SIAM, 2002.