Issue No. 10 - October (2011 vol. 33)
ISSN: 0162-8828
pp: 2013-2025
Ming Ji , University of Illinois at Urbana-Champaign, Urbana
Xiaofei He , Zhejiang University, Hangzhou
Hujun Bao , Zhejiang University, Hangzhou
Chiyuan Zhang , Zhejiang University, Hangzhou
ABSTRACT
In many information processing tasks, one is often confronted with very high-dimensional data. Feature selection techniques are designed to identify a meaningful subset of the original features that can facilitate clustering, classification, and retrieval. In this paper, we consider the feature selection problem in unsupervised learning scenarios, which is particularly difficult due to the absence of class labels that would guide the search for relevant information. Based on Laplacian regularized least squares, which finds a smooth function on the data manifold and minimizes the empirical loss, we propose two novel feature selection algorithms that aim to minimize the expected prediction error of the regularized regression model. Specifically, we select those features such that the size of the parameter covariance matrix of the regularized regression model is minimized. Motivated by experimental design, we use the trace and determinant operators to measure the size of the covariance matrix. Efficient computational schemes are also introduced to solve the corresponding optimization problems. Extensive experimental results on various real-life data sets demonstrate the superiority of the proposed algorithms.
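The abstract describes the selection criterion only at a high level, so the following minimal sketch (not taken from the paper) illustrates the general idea in Python/NumPy: build a graph Laplacian over the data, score a candidate feature subset by the trace of an inverse regularized matrix (an A-optimality-style proxy for the size of the parameter covariance of a Laplacian-regularized regression), and add features greedily. The kNN heat-kernel graph, the regularization parameters lam and mu, and the greedy search loop are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def knn_graph_laplacian(X, k=5, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W from a kNN heat-kernel graph.

    X: (n_samples, n_features) data matrix. (Graph construction is an
    assumption for illustration; the paper's graph may differ.)
    """
    d2 = np.sum(X ** 2, axis=1, keepdims=True)
    dist2 = d2 + d2.T - 2.0 * X @ X.T              # pairwise squared distances
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist2[i])[1:k + 1]       # skip the point itself
        W[i, nbrs] = np.exp(-dist2[i, nbrs] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                         # symmetrize
    return np.diag(W.sum(axis=1)) - W

def variance_score(X_S, L, lam=0.1, mu=1e-3):
    """A-optimality-style score: trace of the inverse of a regularized matrix
    built from the candidate feature subset X_S (n_samples x |S|)."""
    M = X_S.T @ X_S + lam * X_S.T @ L @ X_S + mu * np.eye(X_S.shape[1])
    return np.trace(np.linalg.inv(M))

def greedy_feature_selection(X, num_features, k=5, lam=0.1):
    """Greedily add the feature that most reduces the variance score."""
    L = knn_graph_laplacian(X, k=k)
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(num_features):
        best_f, best_s = None, np.inf
        for f in remaining:
            s = variance_score(X[:, selected + [f]], L, lam=lam)
            if s < best_s:
                best_f, best_s = f, s
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```

Replacing the trace in variance_score with a (log-)determinant would give the D-optimality-style variant mentioned in the abstract; the greedy search shown here is only one of several possible ways to optimize either criterion.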
INDEX TERMS
Feature selection, dimensionality reduction, manifold, regularization, regression, clustering.
CITATION
Ming Ji, Xiaofei He, Hujun Bao, Chiyuan Zhang, "A Variance Minimization Criterion to Feature Selection Using Laplacian Regularization", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 10, pp. 2013-2025, October 2011, doi:10.1109/TPAMI.2011.44