Parallel Architectures, Algorithms and Programming, International Symposium on (2011)
Tianjin, China
Dec. 9, 2011 to Dec. 11, 2011
ISBN: 978-0-7695-4575-2
pp: 105-109
ABSTRACT
Pattern recognition is one of the most popular research topics today. One of its central problems is reducing sample variation within the same class while preserving discrimination between different classes. Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table and represent it as a set of new orthogonal variables called principal components. Mathematically, PCA depends on the eigen-decomposition of positive semi-definite matrices and on the singular value decomposition (SVD) of rectangular matrices. To produce more reliable eigenvalues and thereby boost classification accuracy, this project constructs a novel covariance matrix that preserves image locality. The results are significant: accuracy on the training data predicts accuracy on the testing data, and compared with the covariance matrix used previously, the new covariance matrix better reflects the relationship between pixels of an image, leading to improved classification.
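To make the abstract's description concrete, the minimal sketch below performs eigen feature extraction via PCA on a covariance matrix that is element-wise weighted by pixel proximity. The Gaussian spatial weight matrix, the function name locality_weighted_pca, and all parameter values are illustrative assumptions for this sketch only; the paper's actual locality-preserving covariance construction is not reproduced here.

```python
import numpy as np

def locality_weighted_pca(X, image_shape, n_components=10, sigma=2.0):
    """PCA on a locality-weighted sample covariance matrix (illustrative sketch).

    X            : (n_samples, n_pixels) array of flattened training images
    image_shape  : (rows, cols) of each image
    n_components : number of leading eigenvectors (eigen features) to keep
    sigma        : bandwidth of the assumed Gaussian locality weighting
    """
    rows, cols = image_shape
    # Pixel coordinates, used to weight covariance entries by spatial distance.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    # Pairwise squared distances between pixel locations.
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))        # assumed Gaussian locality weights

    Xc = X - X.mean(axis=0)                     # center the data
    cov = (Xc.T @ Xc) / (X.shape[0] - 1)        # ordinary sample covariance
    cov_local = cov * W                         # element-wise locality weighting
                                                # (Hadamard product keeps the matrix
                                                # symmetric positive semi-definite)

    # Eigen-decomposition of the symmetric weighted covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(cov_local)
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvals[order], eigvecs[:, order]

# Example usage on random stand-in data (16x16 "images"):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 16 * 16))
    eigvals, eigvecs = locality_weighted_pca(X, (16, 16), n_components=5)
    features = (X - X.mean(axis=0)) @ eigvecs   # project samples onto eigen features
    print(eigvals.shape, features.shape)
```

The only difference from standard PCA in this sketch is the element-wise weighting of the covariance matrix by pixel proximity, which is one plausible way to emphasize local pixel relationships before extracting eigen features.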
INDEX TERMS
principal component analysis, weight, eigen feature extraction
CITATION
Xiuji Han, Yun Liu, Fei Xia, Hongjie Zhang, "Eigen Feature Extraction by Image Locality Preservation," Parallel Architectures, Algorithms and Programming, International Symposium on, pp. 105-109, 2011, doi:10.1109/PAAP.2011.22