Issue No.09 - September (2006 vol.28)
pp: 1393-1403
ABSTRACT
We provide evidence that nonlinear dimensionality reduction, clustering, and data set parameterization can be solved within one and the same framework. The main idea is to define a system of coordinates with an explicit metric that reflects the connectivity of a given data set and that is robust to noise. Our construction, which is based on a Markov random walk on the data, offers a general scheme of simultaneously reorganizing and subsampling graphs and arbitrarily shaped data sets in high dimensions using intrinsic geometry. We show that clustering in embedding spaces is equivalent to compressing operators. The objective of data partitioning and clustering is to coarse-grain the random walk on the data while at the same time preserving a diffusion operator for the intrinsic geometry or connectivity of the data set up to some accuracy. We show that the quantization distortion in diffusion space bounds the error of compression of the operator, thus giving a rigorous justification for k-means clustering in diffusion space and a precise measure of the performance of general clustering algorithms.
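To illustrate the approach the abstract describes, the following is a minimal sketch (not the authors' implementation): a diffusion-map embedding built from a Markov random walk on a Gaussian-kernel affinity graph, followed by k-means clustering in diffusion space. The dataset, the kernel scale epsilon, the diffusion time t, and the number of clusters are all hypothetical choices made for the example.

```python
# Sketch of diffusion maps + k-means in diffusion space (assumed parameters).
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    # Gaussian kernel affinities encoding local connectivity of the data.
    W = np.exp(-cdist(X, X, "sqeuclidean") / epsilon)
    d = W.sum(axis=1)                      # degrees
    # Symmetric conjugate of the Markov matrix P = D^{-1} W, for stable eigendecomposition.
    D_half = np.sqrt(d)
    S = W / np.outer(D_half, D_half)
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]         # eigenvalues in decreasing order
    vals, vecs = vals[order], vecs[:, order]
    # Right eigenvectors of P; drop the trivial constant eigenvector.
    psi = vecs / D_half[:, None]
    # Diffusion coordinates: eigenvectors scaled by eigenvalues^t.
    return psi[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)

# Usage: two noisy concentric rings, clustered by k-means in diffusion space.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
radii = np.concatenate([np.full(150, 1.0), np.full(150, 3.0)])
X = np.c_[radii * np.cos(theta), radii * np.sin(theta)]
X += 0.1 * rng.normal(size=X.shape)

coords = diffusion_map(X, epsilon=0.5, n_components=2, t=3)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
```

In this sketch, k-means operates on the diffusion coordinates rather than the raw points, which is the sense in which quantization distortion in diffusion space serves as the clustering objective.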
INDEX TERMS
Machine learning, text analysis, knowledge retrieval, quantization, graph-theoretic methods, compression (coding), clustering, clustering similarity measures, information visualization, Markov processes, graph algorithms.
CITATION
Stéphane Lafon, Ann B. Lee, "Diffusion Maps and Coarse-Graining: A Unified Framework for Dimensionality Reduction, Graph Partitioning, and Data Set Parameterization", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 28, no. 9, pp. 1393-1403, September 2006, doi:10.1109/TPAMI.2006.184