Issue No. 3, March 2008 (vol. 30), pp. 438-450
ABSTRACT
Data observations that lie on a manifold can be approximated by a collection of overlapping local patches, whose alignment in a low-dimensional Euclidean space provides an embedding of the data. This paper describes an embedding method that uses classical multidimensional scaling as a local model, exploiting the fact that a manifold locally resembles a Euclidean space. A set of overlapping neighborhoods is chosen by a greedy approximation algorithm for minimum set cover. Local patches derived from these neighborhoods by classical multidimensional scaling are aligned so as to minimize a residual measure, which is a quadratic form of the resulting global coordinates and can be minimized analytically by solving an eigenvalue problem. The method requires only distances within each neighborhood and produces locally isometric embeddings. The size of the eigenvalue problem scales with the number of overlapping neighborhoods rather than the number of data points. Experiments on both synthetic and real-world data sets demonstrate the effectiveness of the method. Extensions and variations of the method are discussed.
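The abstract names two standard building blocks: classical multidimensional scaling as the local model, and a greedy approximation to minimum set cover for choosing overlapping neighborhoods. The following sketch illustrates those two textbook components only; it is not the paper's implementation, and the function names and the alignment step it omits are left out deliberately.

```python
import numpy as np

def classical_mds(D, dim):
    """Classical (Torgerson) MDS: recover `dim`-dimensional coordinates
    from a pairwise distance matrix D (n x n)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]            # keep the top-`dim` eigenpairs
    scale = np.sqrt(np.maximum(w[idx], 0.0))   # clip tiny negative eigenvalues
    return V[:, idx] * scale                   # local coordinates (n x dim)

def greedy_set_cover(universe, candidate_sets):
    """Greedy approximation of minimum set cover: repeatedly pick the
    candidate set that covers the most still-uncovered elements."""
    uncovered, chosen = set(universe), []
    while uncovered:
        best = max(range(len(candidate_sets)),
                   key=lambda i: len(candidate_sets[i] & uncovered))
        if not candidate_sets[best] & uncovered:
            break                              # remaining elements uncoverable
        chosen.append(best)
        uncovered -= candidate_sets[best]
    return chosen
```

In the paper's setting, `greedy_set_cover` would select a small family of overlapping neighborhoods covering all data points, `classical_mds` would embed each neighborhood from its internal distances alone, and the resulting patches would then be aligned globally via the eigenvalue problem described above.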
CITATION
Li Yang, "Alignment of Overlapping Locally Scaled Patches for Multidimensional Scaling and Dimensionality Reduction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 3, pp. 438-450, March 2008, doi:10.1109/TPAMI.2007.70706