
Bibliographic References  
Hai H. Dam, Hussein A. Abbass, Chris Lokan, Xin Yao, "Neural-Based Learning Classifier Systems," IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 1, pp. 26-39, January 2008.
BibTeX:
@article{10.1109/TKDE.2007.190671,
  author    = {Hai H. Dam and Hussein A. Abbass and Chris Lokan and Xin Yao},
  title     = {Neural-Based Learning Classifier Systems},
  journal   = {IEEE Transactions on Knowledge and Data Engineering},
  volume    = {20},
  number    = {1},
  issn      = {1041-4347},
  year      = {2008},
  pages     = {26-39},
  doi       = {http://doi.ieeecomputersociety.org/10.1109/TKDE.2007.190671},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}
[1] H.A. Abbass, M. Towsey, and G.D. Finn, “C-Net: A Method for Generating Non-Deterministic and Dynamic Multivariate Decision Trees,” Knowledge and Information Systems, vol. 3, no. 2, pp. 184-197, 2001.
[2] E. Bernadó-Mansilla and J.M. Garrell-Guiu, “Accuracy-Based Learning Classifier Systems: Models, Analysis, and Applications to Classification Tasks,” Evolutionary Computation, vol. 11, no. 3, pp. 209-238, 2003.
[3] E. Bernadó-Mansilla, X. Llorà, and J.M. Garrell-Guiu, “XCS and GALE: A Comparative Study of Two Learning Classifier Systems with Six Other Learning Algorithms on Classification Tasks,” Proc. Fourth Int'l Workshop Learning Classifier Systems (IWLCS '01), short version published in Proc. Genetic and Evolutionary Computation Conf. (GECCO '01), pp. 337-341, 2001.
[4] G. Brown, “Diversity in Neural Network Ensembles,” PhD dissertation, School of Computer Science, Univ. of Birmingham, 2004.
[5] G. Brown, J. Wyatt, R. Harris, and X. Yao, “Diversity Creation Methods: A Survey and Categorisation,” Information Fusion, vol. 6, no. 1, pp. 5-20, 2005.
[6] L. Bull, “On Using Constructivism in Neural Classifier Systems,” Parallel Problem Solving from Nature–PPSN VII, pp. 558-567, 2002.
[7] L. Bull and T. O'Hara, “Accuracy-Based Neuro and Neuro-Fuzzy Classifier Systems,” Proc. Genetic and Evolutionary Computation Conf. (GECCO '02), pp. 905-911, 2002.
[8] M.V. Butz, “Rule-Based Evolutionary Online Learning Systems: Learning Bounds, Classification, and Prediction,” PhD dissertation, Univ. of Illinois at Urbana–Champaign, 2004.
[9] M.V. Butz, “Kernel-Based, Ellipsoidal Conditions in the Real-Valued XCS Classifier System,” Proc. Conf. Genetic and Evolutionary Computation (GECCO '05), pp. 1835-1842, 2005.
[10] H.H. Dam, H.A. Abbass, and C. Lokan, “Be Real! XCS with Continuous-Valued Inputs,” Proc. Eighth Int'l Workshop Learning Classifier Systems (IWLCS '05), 2005.
[11] P. Duell, I. Fermin, and X. Yao, “Speciation Techniques in Evolved Ensembles with Negative Correlation Learning,” Proc. IEEE Congress Evolutionary Computation (CEC '06), pp. 1621, 2006.
[12] D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, 1989.
[13] S. Haykin, Neural Networks: A Comprehensive Foundation, second ed. Prentice Hall, 1999.
[14] J.H. Holland, Adaptation in Natural and Artificial Systems. Univ. of Michigan Press, 1975, republished by the MIT Press, 1992.
[15] J.H. Holland, “Properties of the Bucket Brigade,” Proc. First Int'l Conf. Genetic Algorithms, pp. 1-7, 1985.
[16] J.H. Holland, “Escaping Brittleness: The Possibilities of General-Purpose Learning Algorithms Applied to Parallel Rule-Based Systems,” Machine Learning: An Artificial Intelligence Approach, Mitchell, Michalski, and Carbonell, eds., vol. 2, chapter 20, pp. 593-623. Morgan Kaufmann, 1986.
[17] K. Hornik, M. Stinchcombe, and H. White, “Multilayer Feedforward Networks Are Universal Approximators,” Neural Networks, vol. 2, no. 5, pp. 359-366, 1989.
[18] J. Hurst and L. Bull, “A Self-Adaptive Neural Learning Classifier System with Constructivism for Mobile Robot Control,” Parallel Problem Solving from Nature–PPSN VIII, pp. 942-951, Springer, 2004.
[19] M.M. Islam, X. Yao, and K. Murase, “A Constructive Algorithm for Training Cooperative Neural Network Ensembles,” IEEE Trans. Neural Networks, vol. 14, no. 4, pp. 820-834, 2003.
[20] T. Kovacs, “What Should a Classifier System Learn and How Should We Measure It,” J. Soft Computing, vol. 6, nos. 3-4, pp. 171-182, June 2002.
[21] P.L. Lanzi and A. Perrucci, “Extending the Representation of Classifier Conditions Part I: From Binary to Messy Coding,” Proc. Genetic and Evolutionary Computation Conf., W. Banzhaf, J. Daida, A.E. Eiben, M.H. Garzon, V. Honavar, M. Jakiela, and R.E. Smith, eds., vol. 1, pp. 337-344, 1999.
[22] P.L. Lanzi and A. Perrucci, “Extending the Representation of Classifier Conditions Part II: From Messy Coding to S-Expressions,” Proc. Genetic and Evolutionary Computation Conf., W. Banzhaf, J. Daida, A.E. Eiben, M.H. Garzon, V. Honavar, M. Jakiela, and R.E. Smith, eds., vol. 1, pp. 345-352, 1999.
[23] W.P. Lincoln and J. Skrzypek, “Synergy of Clustering Multiple Back Propagation Networks,” Advances in Neural Information Processing Systems 2, pp. 650-659, 1990.
[24] Y. Liu, “Negative Correlation Learning and Evolutionary Design of Neural Network Ensembles,” PhD dissertation, Univ. College, Univ. of New South Wales, Australian Defence Force Academy, 1999.
[25] Y. Liu and X. Yao, “Negatively Correlated Neural Networks Can Produce Best Ensembles,” Australian J. Intelligent Information Processing Systems, vol. 4, no. 3/4, pp. 176-185, 1997.
[26] Y. Liu, X. Yao, and T. Higuchi, “Evolutionary Ensembles with Negative Correlation Learning,” IEEE Trans. Evolutionary Computation, vol. 4, no. 4, pp. 380-387, 2000.
[27] R. McKay and H. Abbass, “Anti-Correlation: A Diversity Promoting Mechanism in Ensemble Learning,” Australian J. Intelligent Information Processing Systems, vol. 7, no. 3/4, pp. 139-149, 2001.
[28] D. Newman, S. Hettich, C. Blake, and C. Merz, “UCI Repository of Machine Learning Databases,” Dept. of Information and Computer Science, Univ. of California at Irvine, http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.
[29] N.J. Nilsson, Learning Machines: Foundations of Trainable Pattern-Classifying Systems. McGraw-Hill, 1965.
[30] S. Quartz and T. Sejnowski, “The Neural Basis of Cognitive Development: A Constructivist Manifesto,” Behavioral and Brain Sciences, vol. 20, no. 4, pp. 537-596, 1997.
[31] K. Shafi, H. Abbass, and W. Zhu, “The Role of Early Stopping and Population Size in XCS for Intrusion Detection,” Proc. Sixth Int'l Conf. Simulated Evolution and Learning (SEAL '06), pp. 50-57, 2006.
[32] A. Sharkey, “On Combining Artificial Neural Nets,” Connection Science, vol. 8, pp. 299-313, 1996.
[33] M. Skurichina, L. Kuncheva, and R. Duin, “Bagging and Boosting for the Nearest Mean Classifier: Effects of Sample Size on Diversity and Accuracy,” Proc. Int'l Workshop Multiple Classifier Systems, pp. 62-71, 2002.
[34] S.F. Smith, “A Learning System Based on Genetic Adaptive Algorithms,” PhD dissertation, Univ. of Pittsburgh, 1980.
[35] S.W. Wilson, “Classifier Fitness Based on Accuracy,” Evolutionary Computation, vol. 3, no. 2, pp. 149-175, 1995.
[36] S.W. Wilson, “Generalization in the XCS Classifier System,” Proc. Third Ann. Conf. Genetic Programming, J.R. Koza, W. Banzhaf, K. Chellapilla, K. Deb, M. Dorigo, D.B. Fogel, M.H. Garzon, D.E. Goldberg, H. Iba, and R. Riolo, eds., pp. 665-674, 1998.
[37] S.W. Wilson, “Get Real! XCS with Continuous-Valued Inputs,” Learning Classifier Systems: From Foundations to Applications, P. Lanzi, W. Stolzmann, and S. Wilson, eds., LNAI 1813, pp. 209-219, Springer, 2000.
[38] S.W. Wilson, “Mining Oblique Data with XCS,” Proc. Third Int'l Workshop Learning Classifier Systems (IWLCS '00), P.L. Lanzi, W. Stolzmann, and S.W. Wilson, eds., pp. 158-174, 2001.