Issue No. 07 - July 2010 (vol. 32)
pp. 1324-1328
Sarunas Raudys , Vilnius University, Vilnius
Aistis Raudys , Vilnius University, Vilnius
ABSTRACT
A novel loss function for training a network of K single-layer perceptrons (KSLPs) is suggested, into which a pairwise misclassification cost matrix can be incorporated directly. The complexity of the network remains the same, and computing the gradient of the loss function requires no additional calculations. Minimizing the loss requires fewer training epochs. The efficacy of cost-sensitive methods depends on the cost matrix, the overlap of the pattern classes, and the sample sizes. Experiments with real-world pattern recognition (PR) tasks show that the novel loss function usually outperforms three benchmark methods.
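The abstract describes training K single-layer perceptrons under a pairwise misclassification cost matrix, where entry (i, j) gives the cost of deciding class j when the true class is i. The sketch below is only a generic illustration of that idea, assuming a cost-weighted squared-error loss over tanh outputs; the weighting scheme, the function names (train_kslp, predict), and the hyperparameters are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

def train_kslp(X, y, cost, lr=0.01, epochs=100, seed=None):
    """Cost-sensitive training of K single-layer perceptrons (one output per class).

    X    : (n, d) training data
    y    : (n,) integer class labels in {0, ..., K-1}
    cost : (K, K) pairwise cost matrix, cost[i, j] = cost of deciding class j
           when the true class is i (zero diagonal); illustrative assumption.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    K = cost.shape[0]
    W = rng.normal(scale=0.01, size=(K, d))   # one weight row per perceptron
    b = np.zeros(K)

    for _ in range(epochs):
        for i in rng.permutation(n):
            o = np.tanh(W @ X[i] + b)          # K perceptron outputs
            t = -np.ones(K)
            t[y[i]] = 1.0                      # target: +1 for the true class
            # Weight each output's squared error by the pairwise cost of that
            # confusion; the true-class output keeps unit weight (a choice made
            # here for illustration only).
            w_err = cost[y[i]].copy()
            w_err[y[i]] = 1.0
            grad_o = 2.0 * w_err * (o - t) * (1.0 - o ** 2)  # dLoss/dActivation
            W -= lr * np.outer(grad_o, X[i])
            b -= lr * grad_o
    return W, b

def predict(W, b, X):
    """Assign each sample to the class with the largest perceptron output."""
    return np.argmax(X @ W.T + b, axis=1)
```

In this sketch, costly confusions contribute larger gradients, so the decision boundaries are pushed away from expensive errors without adding parameters or extra gradient passes, consistent with the abstract's claim that network complexity and gradient cost are unchanged.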
INDEX TERMS
Cost-sensitive learning, loss function, pairwise classification, perceptron.
CITATION
Sarunas Raudys, Aistis Raudys, "Pairwise Costs in Multiclass Perceptrons," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 7, pp. 1324-1328, July 2010, doi:10.1109/TPAMI.2010.72