Kernel Discriminant Learning for Ordinal Regression
Issue No. 6, June 2010 (vol. 22), pp. 906-910
Jiuyong Li , University of South Australia, Adelaide
Desheng Dash Wu , Reykjavik University, Reykjavik and University of Toronto, Toronto
Bing-Yu Sun , Chinese Academy of Sciences, Hefei
Wen-Bo Li , Chinese Academy of Sciences, Hefei
ABSTRACT
Ordinal regression has wide application in many domains where human evaluation plays a major role. Most existing ordinal regression methods are based on Support Vector Machines (SVM) and suffer from two drawbacks: they ignore the global information of the data, and they have high computational complexity. Linear Discriminant Analysis (LDA) and its kernel version, Kernel Discriminant Analysis (KDA), take the global information of the data and the distribution of the classes into account for classification, but they have not yet been applied to ordinal regression. In this paper, we propose a novel ordinal regression method that extends kernel discriminant learning with a rank constraint. The proposed algorithm is very efficient: its computational complexity is significantly lower than that of other ordinal regression methods. We demonstrate experimentally that the proposed method preserves the rank of the data classes in the projected data space. Compared with benchmark ordinal regression methods, the proposed method is competitive in accuracy.
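The core idea admits a compact illustration. The following Python sketch is a simplified, linear version of discriminant learning with a rank constraint: it minimizes the within-class scatter of the projected data while constraining the projected class means of adjacent ranks to stay in order, then thresholds the projection to predict ranks. The specific constraint form w^T(m_{k+1} - m_k) >= margin, the margin parameter, and the names fit_rank_discriminant and predict_rank are illustrative assumptions for this sketch, not the paper's exact kernelized algorithm.

import numpy as np
from scipy.optimize import minimize

def fit_rank_discriminant(X, y, margin=1.0):
    """Find a projection w whose projected class means appear in rank order.

    Solves (linear, illustrative form):
        min_w  w^T S_w w   s.t.  w^T (m_{k+1} - m_k) >= margin
    where S_w is the within-class scatter and m_k is the mean of rank k.
    """
    classes = np.sort(np.unique(y))          # ordinal labels, ascending
    d = X.shape[1]
    means = np.array([X[y == c].mean(axis=0) for c in classes])

    # Within-class scatter, lightly regularized for numerical stability.
    S_w = sum(np.cov(X[y == c].T, bias=True) * (y == c).sum() for c in classes)
    S_w = S_w + 1e-6 * np.eye(d)

    # One ordering constraint per pair of adjacent ranks (K - 1 in total).
    cons = [{"type": "ineq",
             "fun": lambda w, a=(means[k + 1] - means[k]): a @ w - margin}
            for k in range(len(classes) - 1)]

    res = minimize(lambda w: w @ S_w @ w, x0=np.ones(d), constraints=cons)
    w = res.x

    # Midpoints between adjacent projected class means act as rank thresholds.
    proj_means = means @ w
    thresholds = (proj_means[:-1] + proj_means[1:]) / 2.0
    return w, thresholds

def predict_rank(X, w, thresholds):
    """Return the 0-based rank index of the interval each projection falls in."""
    return np.searchsorted(thresholds, X @ w)

Note that this sketch imposes only K - 1 ordering constraints for K ranks, independent of the number of training samples, which is consistent with the efficiency claim in the abstract: SVM-based ordinal regression typically requires a number of constraints that grows with the number of samples.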
INDEX TERMS
Ordinal regression, linear discriminant analysis, kernel discriminant analysis.
CITATION
Jiuyong Li, Desheng Dash Wu, Bing-Yu Sun, Wen-Bo Li, "Kernel Discriminant Learning for Ordinal Regression," IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 6, pp. 906-910, June 2010, doi:10.1109/TKDE.2009.170