Parallel Architectures, Algorithms and Programming, International Symposium on (2011)
Tianjin, China
Dec. 9, 2011 to Dec. 11, 2011
ISBN: 978-0-7695-4575-2
pp: 105-109
Pattern recognition is one of the most popular research topics today. One of its core problems is reducing sample variation within a class while preserving discrimination between classes. Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table and represent it as a set of new orthogonal variables called principal components. Mathematically, PCA depends on the eigen-decomposition of positive semi-definite matrices and on the singular value decomposition (SVD) of rectangular matrices. To produce more reliable eigenvalues and thereby boost classification accuracy, this work constructs a novel covariance matrix that preserves image locality. The results are significant: accuracy on the training data predicts accuracy on the testing data, and compared with the covariance matrix used previously, the new covariance matrix better reflects the relationships between the pixels of an image, yielding better classification.
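As background for the abstract, standard PCA via eigen-decomposition of the covariance matrix can be sketched as below. This is a minimal illustration of conventional PCA only, not the paper's locality-preserving covariance construction (which the abstract does not specify); the function name `pca` and the synthetic data are assumptions for illustration.

```python
import numpy as np

def pca(X, k):
    """Standard PCA: project X onto the top-k principal components.

    X: (n_samples, n_features) data matrix; k: number of components.
    """
    # Center the data so the covariance reflects variation around the mean.
    Xc = X - X.mean(axis=0)
    # Sample covariance matrix (features x features); positive semi-definite.
    C = Xc.T @ Xc / (X.shape[0] - 1)
    # Eigen-decomposition; eigh is the right choice for symmetric matrices.
    eigvals, eigvecs = np.linalg.eigh(C)
    # Sort eigenvectors by descending eigenvalue and keep the top k.
    order = np.argsort(eigvals)[::-1][:k]
    components = eigvecs[:, order]
    # Project the centered data onto the principal components.
    return Xc @ components

# Usage on synthetic data: reduce 5 correlated features to 2 components.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)
```

The first projected dimension captures the largest variance, the second the next largest, which is why unreliable eigenvalues (the problem the paper targets) directly degrade the quality of the extracted features.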
Keywords: principal component analysis, weight, eigen feature extraction

H. Zhang, F. Xia, X. Han and Y. Liu, "Eigen Feature Extraction by Image Locality Preservation," International Symposium on Parallel Architectures, Algorithms and Programming (PAAP), Tianjin, China, 2011, pp. 105-109.