IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 8 (August 2003)
Juyang Weng, IEEE
Yilu Zhang, IEEE
Wey-Shiuan Hwang, IEEE
<p><b>Abstract</b>—Appearance-based image analysis techniques require fast computation of principal components of high-dimensional image vectors. We introduce a fast incremental principal component analysis (IPCA) algorithm, called candid covariance-free IPCA (CCIPCA), which computes the principal components of a sequence of samples incrementally, without estimating the covariance matrix (hence covariance-free). The new method is motivated by the concept of statistical efficiency (the estimate has the smallest variance given the observed data). To this end, it keeps the scale of the observations and computes the mean of the observations incrementally, which is an efficient estimate for some well-known distributions (e.g., Gaussian), although the highest possible efficiency is not guaranteed in our case because the sample distribution is unknown. Because the method is intended for real-time applications, it does not allow iterations. It converges very fast for high-dimensional image vectors. Some links between IPCA and the development of the cerebral cortex are also discussed.</p>
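The covariance-free idea summarized in the abstract — averaging terms of the form u(n)u(n)<sup>T</sup>v in place of an explicit covariance matrix, and passing each lower-order component's orthogonal complement to the next — can be sketched as follows. This is a minimal NumPy reconstruction from the abstract's description, not the authors' code; it omits the paper's amnesic averaging parameter, and all names are ours.

```python
import numpy as np

def ccipca(samples, k):
    """Sketch of candid covariance-free IPCA (CCIPCA).

    Estimates the first k principal eigenvectors of a sample stream
    incrementally, without ever forming the covariance matrix: each
    eigenvector estimate v_i is an amortized average of terms
    u u^T v_i / ||v_i||, and estimated components are deflated out of
    the observation before it reaches the next component.
    """
    d = samples.shape[1]
    mean = np.zeros(d)
    V = np.zeros((k, d))              # unnormalized eigenvector estimates
    for n, x in enumerate(samples, start=1):
        u = x - mean                  # center with the running mean
        mean += (x - mean) / n        # incremental mean update
        for i in range(min(k, n)):
            if i == n - 1:
                V[i] = u              # initialize with the current residual
            else:
                vi = V[i]
                # amortized average: old estimate plus a new sample term,
                # standing in for an eigenvector of the (unformed) covariance
                V[i] = (n - 1) / n * vi + (1 / n) * (u @ vi) / np.linalg.norm(vi) * u
                # deflate: the next component sees the orthogonal complement
                vn = V[i] / np.linalg.norm(V[i])
                u = u - (u @ vn) * vn
    # return unit-length eigenvector estimates
    return V / np.linalg.norm(V, axis=1, keepdims=True)
```

Note that each sample is touched once and discarded, which is what makes the scheme compatible with the real-time, no-iteration constraint stated in the abstract.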
Principal component analysis, incremental principal component analysis, stochastic gradient ascent (SGA), generalized Hebbian algorithm (GHA), orthogonal complement.
Juyang Weng, Yilu Zhang, Wey-Shiuan Hwang, "Candid Covariance-Free Incremental Principal Component Analysis", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 8, pp. 1034-1040, August 2003, doi:10.1109/TPAMI.2003.1217609