Discriminant ECOC: A Heuristic Method for Application Dependent Design of Error Correcting Output Codes
June 2006 (vol. 28, no. 6)
pp. 1007-1012
We present a heuristic method for learning error-correcting output code (ECOC) matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, optimal codeword separation is sacrificed in favor of maximum class discrimination within the partitions. The hierarchical partition set is created using a binary tree; as a result, a compact matrix with high discrimination power is obtained. Our method is validated on datasets from the UCI repository and applied to a real problem, the classification of traffic sign images.
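The tree-based construction described in the abstract maps directly onto code: each internal node of the class-partition tree splits its set of classes into two groups and contributes one column of a ternary coding matrix (+1 for one group, -1 for the other, 0 for classes not involved at that node). The Python sketch below is only an illustration of that mapping, not the authors' implementation: the names build_ecoc_matrix and split_classes are hypothetical, and the placeholder bipartition simply halves the class set, whereas Discriminant ECOC would instead choose the split that maximizes its discriminative criterion on the training data.

# Minimal sketch (assumed, illustrative only): build an ECOC coding matrix
# from a binary tree of class partitions. Each internal node contributes one
# column: +1 for one side of the split, -1 for the other, 0 elsewhere.
import numpy as np

def split_classes(class_set):
    # Placeholder bipartition: halve the sorted class set.
    # In Discriminant ECOC this choice would maximize a discriminative
    # criterion over the training data; here it is only illustrative.
    classes = sorted(class_set)
    half = len(classes) // 2
    return set(classes[:half]), set(classes[half:])

def build_ecoc_matrix(classes):
    # Return an n_classes x (n_classes - 1) ternary coding matrix.
    classes = sorted(classes)
    index = {c: i for i, c in enumerate(classes)}
    columns = []
    stack = [set(classes)]            # class sets still to be split
    while stack:
        node = stack.pop()
        if len(node) < 2:
            continue                  # leaves (single classes) add no column
        left, right = split_classes(node)
        col = np.zeros(len(classes), dtype=int)
        for c in left:
            col[index[c]] = +1
        for c in right:
            col[index[c]] = -1
        columns.append(col)
        stack.extend([left, right])
    return np.column_stack(columns)

if __name__ == "__main__":
    M = build_ecoc_matrix(range(5))
    print(M)                          # 5 x 4 matrix, one column per internal node

For N classes the tree has N-1 internal nodes, so the coding matrix has N-1 columns; this is what makes the code compact while each column still encodes a two-group problem chosen for class discrimination.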
References:
[1] K. Crammer and Y. Singer, “On the Learnability and Design of Output Codes for Multiclass Problems,” Machine Learning, vol. 47, no. 2-3, pp. 201-233, 2002.
[2] A. Passerini, M. Pontil, and P. Frasconi, “New Results on Error Correcting Codes of Kernel Machines,” IEEE Trans. Neural Networks, vol. 15, no. 1, pp. 45-54, 2004.
[3] V.N. Vapnik, The Nature of Statistical Learning Theory. Springer, 1995.
[4] J.R. Quinlan, “Induction of Decision Trees,” Machine Learning, vol. 1, no. 1, pp. 81-106, 1986.
[5] Y. Freund and R.E. Schapire, “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting,” J. Computer and System Sciences, vol. 55, no. 1, pp. 119-139, 1997.
[6] T. Hastie and R. Tibshirani, “Classification by Pairwise Coupling,” Annals of Statistics, vol. 26, no. 2, pp. 451-471, 1998.
[7] E.L. Allwein, R.E. Schapire, and Y. Singer, “Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers,” J. Machine Learning Research, vol. 1, pp. 113-141, 2000.
[8] T.G. Dietterich and G. Bakiri, “Solving Multiclass Learning Problems via Error-Correcting Output Codes,” J. Artificial Intelligence Research, vol. 2, pp. 263-286, 1995.
[9] P. Pudil, F. Ferri, J. Novovicová, and J. Kittler, “Floating Search Methods for Feature Selection with Nonmonotonic Criterion Functions,” Proc. Int'l Conf. Pattern Recognition, pp. 279-283, 1994.
[10] J.N. Kapur and H.K. Kesavan, Entropy Optimization Principles with Applications. London: Academic Press, 1992.
[11] J. Principe, D. Xu, and J. Fisher III, “Information Theoretic Learning,” Unsupervised Adaptive Filtering, Wiley, 2000.
[12] K. Torkkola, “Feature Extraction by Non-Parametric Mutual Information Maximization,” J. Machine Learning Research, vol. 3, pp. 1415-1438, 2003.
[13] J. Friedman, T. Hastie, and R. Tibshirani, “Additive Logistic Regression: A Statistical View of Boosting,” Annals of Statistics, vol. 28, no. 3, pp. 337-374, 2000.
[14] L. Kuncheva, Combining Pattern Classifiers: Methods and Algorithms. Wiley, 2004.
[15] R.E. Schapire, “Using Output Codes to Boost Multiclass Learning Problems,” Machine Learning: Proc. 14th Int'l Conf., pp. 313-321, 1997.
[16] C. Hsu and C. Lin, “A Comparison of Methods for Multi-Class Support Vector Machines,” IEEE Trans. Neural Networks, vol. 13, no. 2, pp. 415-425, Mar. 2002.
[17] P.M. Murphy and D.W. Aha, UCI Repository of Machine Learning Databases, Dept. of Information and Computer Science, Univ. of California, Irvine, 1994.

Index Terms:
Multiple classifiers, multiclass classification, visual object recognition.
Citation:
Oriol Pujol, Petia Radeva, and Jordi Vitrià, "Discriminant ECOC: A Heuristic Method for Application Dependent Design of Error Correcting Output Codes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 6, pp. 1007-1012, June 2006, doi:10.1109/TPAMI.2006.116