Marco Muselli and Diego Liberati, "Binary Rule Generation via Hamming Clustering," IEEE Transactions on Knowledge and Data Engineering, vol. 14, no. 6, pp. 1258-1268, Nov./Dec. 2002.
Abstract—The generation of a set of rules underlying a classification problem is performed by applying a new algorithm called Hamming Clustering (HC). It reconstructs the
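The abstract breaks off here, but the title and keywords indicate that HC builds classification rules by clustering binary patterns in Hamming space. As a loose illustration only (an assumption for exposition, not the authors' actual procedure; the function names are invented), one core idea in this family of methods is expanding a training pattern into an implicant, replacing bits with don't-cares as long as no opposite-class example is covered:

```python
# Sketch of cube expansion in the spirit of Hamming Clustering:
# generalize a binary example into an implicant ('-' = don't care)
# that covers no example of the opposite class.

def covers(cube, pattern):
    """True if pattern matches cube (bits agree wherever cube has 0/1)."""
    return all(c == '-' or c == p for c, p in zip(cube, pattern))

def expand(seed, negatives):
    """Greedily replace bits of seed with '-' while no negative is covered."""
    cube = list(seed)
    for i in range(len(cube)):
        saved, cube[i] = cube[i], '-'
        if any(covers(cube, n) for n in negatives):
            cube[i] = saved  # undo: the widened cube hit an opposite-class point
    return ''.join(cube)

positives = ['110', '111']
negatives = ['001', '011']
rules = {expand(p, negatives) for p in positives}
# each cube reads as an AND rule over the inputs, e.g. '--0' means "x3 = 0"
print(rules)
```

Each resulting cube corresponds to a conjunctive rule, and their union forms the disjunctive expression for the class, which is how clustering in Hamming space yields an intelligible rule set.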
[1] S.I. Gallant, Neural Network Learning and Expert Systems. Cambridge, Mass.: MIT Press, 1993.
[2] D. Liberati, "Expert Systems: The State of the Art," The Ligand Quarterly, vol. 8, pp. 606-611, 1989.
[3] B. Buchanan and E. Shortliffe, Rule-Based Expert Systems. Reading, Mass.: Addison-Wesley, 1984.
[4] J. McDermott, "R1: The Formative Years," AI Magazine, vol. 2, pp. 21-29, 1981.
[5] R. Andrews, J. Diederich, and A. Tickle, "A Survey and Critique of Techniques for Extracting Rules from Trained Artificial Neural Networks," Knowledge-Based Systems, vol. 8, pp. 373-389, 1995.
[6] I.A. Taha and J. Ghosh, "Symbolic Interpretation of Artificial Neural Networks," IEEE Trans. Knowledge and Data Eng., vol. 11, pp. 448-463, 1999.
[7] R.M. Goodman, C.M. Higgins, J.W. Miller, and P. Smyth, "Rule-Based Neural Networks for Classification and Probability Estimation," Neural Computation, vol. 4, pp. 781-804, 1992.
[8] K.P. Huber and M.R. Berthold, "Building Precise Classifiers with Automatic Rule Extraction," Proc. IEEE Int'l Conf. Neural Networks, pp. III-1263-1268, 1995.
[9] L.M. Fu, Neural Networks in Computer Intelligence. McGraw-Hill, 1994.
[10] G.G. Towell and J.W. Shavlik, "The Extraction of Refined Rules from Knowledge-Based Neural Networks," Machine Learning, vol. 13, no. 1, pp. 71-101, 1993.
[11] R. Setiono and H. Liu, "Symbolic Representation of Neural Networks," Computer, pp. 71-77, Mar. 1996.
[12] R. Setiono, "Extracting M-of-N Rules from Trained Neural Networks," IEEE Trans. Neural Networks, vol. 11, no. 2, pp. 512-519, Mar. 2000.
[13] M. Ishikawa, "Rule Extraction by Successive Regularization," Neural Networks, vol. 13, pp. 1171-1183, 2000.
[14] C.T. Lin and C.S.G. Lee, "Neural-Network-Based Fuzzy Logic Control and Decision System," IEEE Trans. Computers, vol. 40, no. 12, pp. 1320-1326, Dec. 1991.
[15] S. Horikawa, T. Furuhashi, and Y. Uchikawa, "On Fuzzy Modeling Using Fuzzy Neural Networks with Back-Propagation Algorithm," IEEE Trans. Neural Networks, vol. 3, no. 5, pp. 801-806, Sept. 1992.
[16] P.K. Simpson, "Fuzzy Min-Max Neural Networks, Part 2: Clustering," IEEE Trans. Fuzzy Systems, vol. 1, no. 1, pp. 32-45, Feb. 1993.
[17] M. Setnes, "Supervised Fuzzy Clustering for Rule Extraction," IEEE Trans. Fuzzy Systems, vol. 8, pp. 416-424, 2000.
[18] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees. Belmont: Wadsworth, 1994.
[19] J.R. Quinlan, C4.5: Programs for Machine Learning. San Mateo, Calif.: Morgan Kaufmann, 1992.
[20] J.R. Quinlan, "Generating Production Rules from Decision Trees," Proc. 10th Int'l Joint Conf. Artificial Intelligence, pp. 304-307, 1987.
[21] G. Pagallo, "Learning DNF by Decision Trees," Proc. 11th Int'l Joint Conf. Artificial Intelligence, pp. 639-644, 1989.
[22] S. Muggleton, Inductive Logic Programming. New York: Academic Press, 1992.
[23] S. Muggleton and L. De Raedt, "Inductive Logic Programming: Theory and Methods," J. Logic Programming, vol. 19/20, pp. 629-679, 1994.
[24] J.R. Quinlan and R.M. Cameron-Jones, "Induction of Logic Programs: FOIL and Related Systems," New Generation Computing, vol. 13, pp. 287-312, 1995.
[25] P.R.J. van der Laag and S.H. Nienhuys-Cheng, "Completeness and Properness of Refinement Operators in Inductive Logic Programming," J. Logic Programming, vol. 34, pp. 201-225, 1998.
[26] J. Fürnkranz, "Pruning Algorithms for Rule Learning," Machine Learning, vol. 27, pp. 139-171, 1997.
[27] J.V. Jaskolski, "Construction of Neural Network Classification Expert Systems Using Switching Theory Algorithms," Proc. Int'l Joint Conf. Neural Networks, pp. I-1-6, 1992.
[28] S.J. Hong, "R-MINI: An Iterative Approach for Generating Minimal Rules from Examples," IEEE Trans. Knowledge and Data Eng., vol. 9, pp. 709-717, 1997.
[29] H.W. Gschwind and E.J. McCluskey, Design of Digital Computers. New York: Springer-Verlag, 1975.
[30] T. Downs and M.F. Schultz, Logic Design with Pascal. New York: Van Nostrand Reinhold, 1988.
[31] R.K. Brayton, G.D. Hachtel, C.T. McMullen, and A.L. Sangiovanni-Vincentelli, Logic Minimization Algorithms for VLSI Synthesis. Boston: Kluwer Academic, 1984.
[32] S.J. Hong, R.G. Cain, and D.L. Ostapko, "MINI: A Heuristic Approach for Logic Minimization," IBM J. Research and Development, vol. 18, pp. 443-458, 1974.
[33] D.L. Dietmeyer, Logic Design of Digital Systems, 3rd ed. Boston: Allyn and Bacon, 1988.
[34] M. Muselli, "Predicting the Generalization Ability of Neural Networks Resembling the Nearest-Neighbor Algorithm," Proc. Int'l Joint Conf. Neural Networks (IJCNN 2000), pp. I-27-33, 2000.
[35] M. Muselli and D. Liberati, "Training Digital Circuits with Hamming Clustering," IEEE Trans. Circuits and Systems—I: Fundamental Theory and Applications, vol. 47, pp. 513-527, 2000.
[36] C. Mead, Analog VLSI and Neural Systems. Reading, Mass.: Addison-Wesley, 1989.
[37] F.N. Sibai and S.D. Kulkarni, "A Time-Multiplexed Reconfigurable Neuroprocessor," IEEE Micro, vol. 17, pp. 58-65, 1997.
[38] S.B. Thrun, J. Bala, E. Bloedorn, I. Bratko, B. Cestnik, K. De Jong, S. Dzeroski, S.E. Fahlman, D. Fisher, R. Hamann, K. Kaufman, S. Keller, I. Kononenko, J. Kreuziger, R.S. Michalski, T. Mitchell, P. Pachowicz, Y. Reich, H. Vafaie, W. Van de Welde, W. Wenzel, J. Wnek, and J. Zhang, "A Performance Comparison of Different Learning Algorithms," Technical Report CMU-CS-91-197, Dept. of Computer Science, Carnegie Mellon Univ., Pittsburgh, Pa., 1991.
[39] D. Michie, D. Spiegelhalter, and C. Taylor, eds., Machine Learning, Neural and Statistical Classification. London: Ellis Horwood, 1994.
[40] C.J. Merz and P.M. Murphy, "UCI Repository of Machine Learning Databases," http://www.ics.uci.edu/~mlearn/MLRepository.html, Dept. of Information and Computer Science, Univ. of California, Irvine, 1996.
[41] O.L. Mangasarian and W.H. Wolberg, "Cancer Diagnosis via Linear Programming," SIAM News, vol. 23, pp. 1-18, 1990.
[42] M.N. Murty, A.K. Jain, and P.J. Flynn, "Data Clustering: A Review," ACM Computing Surveys, vol. 31, no. 3, pp. 264-323, 1999.
[43] R.M. Gray, "Vector Quantization," IEEE ASSP Magazine, pp. 4-29, Apr. 1984.
[44] T. Kohonen, Self-Organization and Associative Memory. Berlin: Springer-Verlag, 1988.
[45] B. Fritzke, "A Growing Neural Gas Network Learns Topologies," Advances in Neural Information Processing Systems 7, G. Tesauro, D.S. Touretzky, and T.K. Leen, eds., Cambridge, Mass.: MIT Press, pp. 625-632, 1995.
[46] G.A. Carpenter, S. Grossberg, and D.B. Rosen, “Fuzzy ART: Fast Stable Learning and Categorization of Analog Patterns by an Adaptive Resonance System,” Neural Networks, vol. 4, pp. 759–771, 1991.
[47] N.B. Karayiannis and J.C. Bezdek, "An Integrated Approach to Fuzzy Learning Vector Quantization and Fuzzy c-Means Clustering," IEEE Trans. Fuzzy Systems, vol. 5, pp. 622-628, 1997.
[48] D.E. Rumelhart, G.E. Hinton, and R.J. Williams, "Learning Internal Representations by Error Propagation," Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1: Foundations, D.E. Rumelhart and J.L. McClelland et al., eds., chapter 8, pp. 318-362. Cambridge, Mass.: MIT Press, 1986.
[49] S. Hong, "Use of Contextual Information for Feature Ranking and Discretization," IEEE Trans. Knowledge and Data Eng., vol. 9, no. 5, pp. 718-730, Sept./Oct. 1997.
[50] W. Iba, J. Wogulis, and P. Langley, "Trading Off Simplicity and Coverage in Incremental Concept Learning," Proc. Fifth Int'l Conf. Machine Learning, pp. 73-79, Ann Arbor, Mich.: Morgan Kaufmann, 1988.
[51] S. Menet, P. Saint-Marc, and G. Medioni, "Active Contour Models: Overview, Implementation and Applications," Proc. Int'l Conf. Systems, Man, and Cybernetics, vol. 212, pp. 194-199, 1990.
[52] R. Setiono, "Generating Concise and Accurate Classification Rules for Breast Cancer Diagnosis," Artificial Intelligence in Medicine, vol. 18, pp. 205-219, 2000.
[53] G.P. Drago and S. Ridella, "Pruning with Interval Arithmetic Perceptron," Neurocomputing, vol. 18, pp. 229-246, 1998.
[54] J.M. Steppe and K.W. Bauer, "Improved Feature Screening in Feedforward Neural Networks," Neurocomputing, vol. 13, pp. 47-58, 1996.