Feature Selection and Kernel Learning for Local Learning-Based Clustering
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 8, August 2011
Hong Zeng, Southeast University, Nanjing, and Hong Kong Baptist University, Hong Kong
The performance of most clustering algorithms relies heavily on the representation of the data in the input space or in the Hilbert space of kernel methods. This paper aims to obtain an appropriate data representation through feature selection or kernel learning within the framework of the Local Learning-Based Clustering (LLC) method (Wu and Schölkopf 2006), which can outperform global learning-based methods when dealing with high-dimensional data lying on a manifold. Specifically, we associate a weight with each feature or kernel and incorporate it into the built-in regularization of the LLC algorithm to account for the relevance of each feature or kernel to the clustering. The weights are then estimated iteratively during the clustering process. We show that the resulting weighted regularization, together with an additional constraint on the weights, is equivalent to a known sparsity-promoting penalty. Hence, the weights of irrelevant features or kernels can be shrunk toward zero. Extensive experiments demonstrate the efficacy of the proposed methods on benchmark data sets.
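The sparsity mechanism sketched in the abstract can be illustrated with a standard result (not necessarily the paper's exact formulation): a weighted regularizer of the form sum_d c_d / w_d, minimized over non-negative weights constrained to sum to one, has a closed-form minimizer w_d proportional to sqrt(c_d), and its minimum value (sum_d sqrt(c_d))^2 is a squared-L1, i.e. sparsity-promoting, penalty. Features with zero relevance score receive exactly zero weight. The relevance scores `c` below are a hypothetical toy input, not quantities from the paper:

```python
import numpy as np

def sparse_feature_weights(c, eps=1e-12):
    """Closed-form minimizer of sum_d c_d / w_d subject to w >= 0, sum(w) = 1.

    Weights are proportional to sqrt(c_d); irrelevant features (c_d = 0)
    get weight exactly zero, illustrating the sparsity-promoting effect.
    """
    s = np.sqrt(np.maximum(np.asarray(c, dtype=float), 0.0))
    return s / max(s.sum(), eps)

# Toy per-feature relevance scores: two informative features, three noise ones.
c = np.array([4.0, 1.0, 0.0, 0.0, 0.0])
w = sparse_feature_weights(c)
# w = [2/3, 1/3, 0, 0, 0]: weights of the irrelevant features are driven to zero.
```

In an iterative scheme of the kind the abstract describes, such a weight update would alternate with the clustering step, re-estimating the relevance scores from the current partition.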
High-dimensional data, local learning-based clustering, feature selection, kernel learning, sparse weighting.
Hong Zeng, "Feature Selection and Kernel Learning for Local Learning-Based Clustering", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 8, pp. 1532-1547, August 2011, doi:10.1109/TPAMI.2010.215