Vol. 25, No. 8, August 2003

ISSN: 0162-8828

pp: 1034-1040

Juyang Weng, IEEE

Yilu Zhang, IEEE

Wey-Shiuan Hwang, IEEE

ABSTRACT

Appearance-based image analysis techniques require fast computation of principal components of high-dimensional image vectors. We introduce a fast incremental principal component analysis (IPCA) algorithm, called candid covariance-free IPCA (CCIPCA), which computes the principal components of a sequence of samples incrementally without estimating the covariance matrix (hence, covariance-free). The new method is motivated by the concept of statistical efficiency (the estimate has the smallest variance given the observed data). Toward this goal, it keeps the scale of the observations and computes the mean of the observations incrementally, which is an efficient estimate for some well-known distributions (e.g., Gaussian), although the highest possible efficiency is not guaranteed in our case because the sample distribution is unknown. The method is designed for real-time applications and, thus, does not allow iterations. It converges very fast for high-dimensional image vectors. Some links between IPCA and the development of the cerebral cortex are also discussed.
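To make the abstract's central idea concrete, the following is a minimal sketch of a covariance-free incremental PCA update in the spirit of CCIPCA. It is illustrative only, not the authors' exact formulation: the `ccipca` function name, its parameters, and the omission of refinements such as the paper's amnesic parameter are our simplifications.

```python
import numpy as np

def ccipca(samples, k):
    """Incrementally estimate the first k principal components of a
    stream of vectors without ever forming the covariance matrix.

    Each eigenvector estimate v_i is nudged toward u * (u . v_i/|v_i|),
    and the sample is then deflated by its projection on v_i before
    the next component is updated (orthogonal-complement deflation).
    """
    d = len(samples[0])
    mean = np.zeros(d)
    V = np.zeros((k, d))  # unnormalized estimates; length tracks the eigenvalue
    for n, x in enumerate(samples, start=1):
        u = x - mean              # center with the current mean estimate
        mean += (x - mean) / n    # incremental mean update
        for i in range(min(k, n)):
            if n == i + 1:
                V[i] = u          # initialize v_i with the first residual seen
            else:
                vhat = V[i] / np.linalg.norm(V[i])
                # covariance-free averaged update: no d-by-d matrix is formed
                V[i] = ((n - 1) / n) * V[i] + (1 / n) * (u @ vhat) * u
                # deflate u for the next component
                vhat = V[i] / np.linalg.norm(V[i])
                u = u - (u @ vhat) * vhat
    return V / np.linalg.norm(V, axis=1, keepdims=True)
```

Note that each step touches only the k current estimates and one sample, so memory is O(kd) rather than the O(d^2) a covariance matrix would require; for image vectors with d in the tens of thousands, this is the difference that makes real-time operation feasible.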

INDEX TERMS

Principal component analysis, incremental principal component analysis, stochastic gradient ascent (SGA), generalized Hebbian algorithm (GHA), orthogonal complement.

CITATION

J. Weng, Y. Zhang and W.-S. Hwang, "Candid Covariance-Free Incremental Principal Component Analysis," in *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 25, no. 8, pp. 1034-1040, 2003.

doi:10.1109/TPAMI.2003.1217609
