Issue No. 9 - September 2008 (vol. 30), pp. 1672-1680
Nojun Kwak, Ajou University, Suwon
ABSTRACT
A method of principal component analysis (PCA) based on a new L1-norm optimization technique is proposed. Unlike conventional PCA, which is based on the L2-norm, the proposed method is robust to outliers because the L1-norm is less sensitive to them; it is also invariant to rotations. The proposed L1-norm optimization technique is intuitive, simple, and easy to implement, and it is proven to find a locally maximal solution. The method is applied to several datasets, and its performance is compared with that of other conventional methods.
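The abstract describes the method only at a high level. As a rough illustration, the sketch below implements one common form of L1-norm maximization for PCA: a sign-flipping fixed-point iteration that increases the objective sum_i |w^T x_i| until a local maximum is reached, followed by greedy deflation for further components. The function names, tie-breaking rule, stopping test, and median centering are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch (not the paper's published pseudocode) of L1-norm PCA by
# sign-flipping iteration: find a unit vector w that locally maximizes
# sum_i |w^T x_i|, then deflate the data and repeat for more components.
import numpy as np

def l1_pca_component(X, max_iter=200, seed=0):
    """X: (n_samples, n_features), assumed already centered.
    Returns a unit vector w locally maximizing np.abs(X @ w).sum()."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        p = np.sign(X @ w)
        p[p == 0] = 1.0                # break ties on samples orthogonal to w
        w_new = X.T @ p                # sign-weighted sum of the samples
        w_new /= np.linalg.norm(w_new)
        if np.isclose(abs(w_new @ w), 1.0):  # fixed point (up to sign): done
            return w_new
        w = w_new
    return w

def l1_pca(X, k):
    """Greedy extraction of k components, deflating X after each one."""
    W = []
    Xr = X - np.median(X, axis=0)      # robust centering (an assumption here)
    for _ in range(k):
        w = l1_pca_component(Xr)
        W.append(w)
        Xr = Xr - np.outer(Xr @ w, w)  # remove the recovered direction
    return np.array(W)                 # rows are the L1 principal directions
```

On clean data this iteration tends to recover directions close to ordinary L2 PCA; when a few samples are grossly corrupted, the L1 objective bounds each sample's pull on w, which is the robustness property the abstract claims.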
INDEX TERMS
L1-norm optimization, principal component analysis
CITATION
Nojun Kwak, "Principal Component Analysis Based on L1-Norm Maximization," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 9, pp. 1672-1680, September 2008, doi:10.1109/TPAMI.2008.114.