Issue No. 09 - Sept. (2012 vol. 34)
P. Honeine , Lab. de Modelisation et Surete des Syst., Univ. de Technol. de Troyes, Troyes, France
Kernel principal component analysis (kernel-PCA) is an elegant nonlinear extension of one of the most widely used data analysis and dimensionality reduction techniques, principal component analysis. In this paper, we propose an online algorithm for kernel-PCA. To this end, we examine a kernel-based version of Oja's rule, initially put forward to extract a linear principal axis. As with most kernel-based machines, the model order equals the number of available observations. To provide an online scheme, we propose to control the model order. We discuss theoretical results, such as an upper bound on the error of approximating the principal functions with the reduced-order model. We derive a recursive algorithm to discover the first principal axis, and extend it to multiple axes. Experimental results demonstrate the effectiveness of the proposed approach, on both a synthetic dataset and images of handwritten digits, with comparison to classical kernel-PCA and iterative kernel-PCA.
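For readers unfamiliar with Oja's rule, the following is a minimal illustrative sketch of its original linear form, the precursor of the kernelized version examined in the paper. The data, learning rate, and sample count are arbitrary choices for demonstration, not values from the paper; the kernelized, reduced-order variant proposed by the author is not reproduced here.

```python
import numpy as np

# Oja's rule for the first linear principal axis:
#   w <- w + eta * y * (x - y * w),   with y = w^T x.
# The Hebbian term eta*y*x grows w along the leading eigenvector of the
# data covariance; the decay term -eta*y^2*w keeps ||w|| near 1.
rng = np.random.default_rng(0)

# Correlated 2-D samples whose leading principal axis is roughly (1, 1).
X = rng.standard_normal((5000, 2)) @ np.array([[2.0, 1.5], [1.5, 2.0]])

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.001  # learning rate (assumed value for this sketch)
for x in X:
    y = w @ x                   # projection onto the current estimate
    w += eta * y * (x - y * w)  # Oja's update

# Sanity check: compare with the leading eigenvector of the sample
# covariance, which Oja's rule should recover up to sign.
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
v = eigvecs[:, -1]
alignment = abs(w @ v) / np.linalg.norm(w)
```

The kernel-based version in the paper replaces the explicit vector `w` with an expansion over observations in a reproducing kernel Hilbert space, which is exactly why the model order grows with the number of samples and motivates the reduced-order scheme.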
reduced-order systems, data analysis, function approximation, principal component analysis, iterative kernel-PCA, online kernel principal component analysis, reduced-order model, dimensionality reduction techniques, online algorithm, Oja's rule, linear principal axis extraction, kernel-based machines, principal function approximation, synthetic dataset, handwritten digit images, classical kernel-PCA, kernel, eigenvalues and eigenfunctions, dictionaries, algorithm design and analysis, data models, training data, recursive algorithm, machine learning, reproducing kernel
P. Honeine, "Online Kernel Principal Component Analysis: A Reduced-Order Model," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 34, no. 9, pp. 1814-1826, Sept. 2012.