2006 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2006)
New York, NY
June 17, 2006 to June 22, 2006
ISSN: 1063-6919
ISBN: 0-7695-2597-0
pp: 109-116
Yuichi Motai, University of Vermont, Burlington, VT
Xingquan Zhu, University of Vermont, Burlington, VT
Robert R. Snapp, University of Vermont, Burlington, VT
Xianhua Jiang, University of Vermont, Burlington, VT
A fast algorithm, Accelerated Kernel Feature Analysis (AKFA), that discovers salient features evidenced in a sample of n unclassified patterns, is presented. Like earlier kernel-based feature selection algorithms, AKFA implicitly embeds each pattern into a Hilbert space, H, induced by a Mercer kernel. An \ell-dimensional linear subspace of H is iteratively constructed by maximizing a variance condition for the nonlinearly transformed sample. This linear subspace can then be used to define more efficient data representations and pattern classifiers. AKFA requires O(\ell n^2) operations, as compared to O(n^3) for Schölkopf, Smola, and Müller's Kernel Principal Component Analysis (KPCA), and O(\ell^2 n^2) for Smola, Mangasarian, and Schölkopf's Sparse Kernel Feature Analysis (SKFA). Numerical experiments show that AKFA can generate more concise feature representations than both KPCA and SKFA, and demonstrate that AKFA obtains classification performance similar to KPCA on a face recognition problem.
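For context, the O(n^3) baseline the abstract compares against is standard kernel PCA: form the n-by-n Gram matrix, center it in feature space, and eigendecompose. A minimal sketch follows; the RBF kernel, the gamma value, and all function names are illustrative assumptions, not the paper's implementation of AKFA.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances -> RBF (Gaussian) Gram matrix
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto the top principal axes of the kernel-induced space H."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Center the Gram matrix in feature space: K_c = (I - 1_n) K (I - 1_n),
    # where 1_n is the n x n matrix with all entries 1/n.
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Full eigendecomposition costs O(n^3); this is the step that AKFA's
    # iterative O(l n^2) feature selection is designed to avoid.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so each extracted feature axis has unit norm in H
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas  # n x n_components matrix of projected features

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

The projected features Z can then feed a downstream classifier, which is the use case (face recognition) evaluated in the paper.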
Yuichi Motai, Xingquan Zhu, Robert R. Snapp, Xianhua Jiang, "Accelerated Kernel Feature Analysis", 2006 IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 109-116, 2006, doi:10.1109/CVPR.2006.43