CSDL Home IEEE Transactions on Pattern Analysis & Machine Intelligence 2011 vol.33 Issue No.10 - October

Issue No.10 - October (2011 vol.33)

pp: 2013-2025

Xiaofei He , Zhejiang University, Hangzhou

Chiyuan Zhang , Zhejiang University, Hangzhou

Hujun Bao , Zhejiang University, Hangzhou

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2011.44

ABSTRACT

In many information processing tasks, one is often confronted with very high-dimensional data. Feature selection techniques are designed to find a meaningful subset of the original features that facilitates clustering, classification, and retrieval. In this paper, we consider the feature selection problem in unsupervised learning scenarios, which is particularly difficult due to the absence of class labels that would guide the search for relevant information. Based on Laplacian regularized least squares, which finds a smooth function on the data manifold while minimizing the empirical loss, we propose two novel feature selection algorithms that aim to minimize the expected prediction error of the regularized regression model. Specifically, we select those features for which the size of the parameter covariance matrix of the regularized regression model is minimized. Motivated by experimental design, we use the trace and determinant operators to measure the size of the covariance matrix. Efficient computational schemes are also introduced to solve the corresponding optimization problems. Extensive experiments on various real-life data sets demonstrate the superiority of the proposed algorithms.
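The covariance-size criterion described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact algorithm: it assumes an unnormalized kNN-graph Laplacian, a LapRLS-style covariance of the form C = (Xsᵀ(I + λ₁L)Xs + λ₂I)⁻¹ for a candidate feature subset, the trace operator as the size measure (an A-optimality-style choice; the determinant would be the D-optimality analogue), and a simple greedy forward search. All function names and parameter values here are illustrative.

```python
import numpy as np

def knn_graph_laplacian(X, k=5):
    # Unnormalized graph Laplacian L = D - W from a symmetrized kNN graph.
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:k + 1]  # nearest neighbors, skipping self
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)  # symmetrize
    return np.diag(W.sum(axis=1)) - W

def trace_score(X, feats, Lg, lam1=1.0, lam2=0.01):
    # Size of the (assumed) parameter covariance for the feature subset
    # `feats`, measured by the trace: tr[(Xs^T (I + lam1*L) Xs + lam2*I)^-1].
    Xs = X[:, feats]
    C = Xs.T @ (np.eye(X.shape[0]) + lam1 * Lg) @ Xs + lam2 * np.eye(len(feats))
    return np.trace(np.linalg.inv(C))

def greedy_select(X, m, k=5):
    # Greedy forward selection: repeatedly add the feature that most
    # reduces the trace of the covariance matrix.
    Lg = knn_graph_laplacian(X, k)
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(m):
        best = min(remaining, key=lambda f: trace_score(X, chosen + [f], Lg))
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8))
X[:, 2] *= 5.0  # a high-variance feature the criterion tends to favor
print(greedy_select(X, 3))
```

A determinant-based score is obtained by replacing `np.trace(np.linalg.inv(C))` with `-np.linalg.slogdet(C)[1]` (maximizing log det C is equivalent to minimizing det C⁻¹).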

INDEX TERMS

Feature selection, dimensionality reduction, manifold, regularization, regression, clustering.

CITATION

Xiaofei He, Chiyuan Zhang, Hujun Bao, "A Variance Minimization Criterion to Feature Selection Using Laplacian Regularization," *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 33, no. 10, pp. 2013-2025, October 2011, doi:10.1109/TPAMI.2011.44