Issue No. 11, November 2009 (vol. 21)
ISSN: 1041-4347
pp: 1590-1603
Yi-Ren Yeh , National Taiwan University of Science and Technology, Taipei
Su-Yun Huang , Academia Sinica, Taipei
Yuh-Jye Lee , National Taiwan University of Science and Technology, Taipei
Sliced inverse regression (SIR) is a renowned dimension reduction method for finding an effective low-dimensional linear subspace. Like many other linear methods, SIR can be extended to a nonlinear setting via the “kernel trick.” The purpose of this paper is twofold. First, we build kernel SIR in a reproducing kernel Hilbert space rigorously, for a more intuitive model explanation and theoretical development. Second, we focus on the implementation algorithm of kernel SIR for fast computation and numerical stability. We adopt a low-rank approximation of the huge and dense full kernel covariance matrix and a reduced singular value decomposition technique for extracting kernel SIR directions. We also explore kernel SIR's ability to combine with other linear learning algorithms for classification and regression, including multiresponse regression. Numerical experiments show that kernel SIR is an effective kernel tool for nonlinear dimension reduction and that it can easily be combined with other linear algorithms to form a powerful toolkit for nonlinear data analysis.
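To make the recipe the abstract describes concrete, here is a minimal Python/NumPy sketch of kernel SIR on classification data: center the kernel matrix, form the between-slice covariance of the kernel features, and take leading generalized eigenvectors as directions. It is an illustration only, not the authors' implementation; it uses a simple ridge term `eps` where the paper uses a low-rank approximation plus a reduced SVD, and the Gaussian kernel, slicing-by-class scheme, and all names are assumptions.

```python
# Illustrative kernel SIR sketch (assumed details, not the paper's code).
import numpy as np
from scipy.linalg import eigh

def gaussian_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_sir(X, y, n_dirs=2, gamma=1.0, eps=1e-3):
    """Return kernel SIR direction coefficients and training projections.

    Slices are taken as the classes of y; for a continuous response one
    would instead sort y and cut it into contiguous slices.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H                               # centered kernel matrix

    # Between-slice covariance of the centered kernel features.
    Sigma_B = np.zeros((n, n))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        m = Kc[idx].mean(axis=0)                 # slice mean (an n-vector)
        Sigma_B += (len(idx) / n) * np.outer(m, m)

    # Total covariance of the kernel features, ridge-regularized so the
    # generalized eigenproblem is well posed (the paper instead handles
    # this with a low-rank approximation and a reduced SVD).
    Sigma = Kc @ Kc / n + eps * np.eye(n)

    # Leading generalized eigenvectors give the kernel SIR directions.
    vals, vecs = eigh(Sigma_B, Sigma)
    order = np.argsort(vals)[::-1][:n_dirs]
    V = vecs[:, order]
    return V, Kc @ V                             # directions, projections

# Usage: project onto the kernel SIR directions, then hand the reduced
# features to any linear learner (e.g., a linear SVM) as the paper suggests.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0]**2 + X[:, 1] > 1).astype(int)
V, Z = kernel_sir(X, y, n_dirs=2, gamma=0.5)
print(Z.shape)                                   # (200, 2)
```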
Dimension reduction, eigenvalue decomposition, kernel, reproducing kernel Hilbert space, singular value decomposition, sliced inverse regression, support vector machines.
Yi-Ren Yeh, Su-Yun Huang, Yuh-Jye Lee, "Nonlinear Dimension Reduction with Kernel Sliced Inverse Regression", IEEE Transactions on Knowledge & Data Engineering, vol. 21, no. 11, pp. 1590-1603, November 2009, doi:10.1109/TKDE.2008.232