Riemannian Manifold Learning
May 2008 (vol. 30 no. 5)
pp. 796-809
Recently, manifold learning has been widely exploited in pattern recognition, data analysis, and machine learning. This paper presents a novel framework, called Riemannian manifold learning (RML), based on the assumption that the input high-dimensional data lie on an intrinsically low-dimensional Riemannian manifold. The main idea is to formulate dimensionality reduction as a classical problem in Riemannian geometry: how to construct coordinate charts for a given Riemannian manifold. We implement the Riemannian normal coordinate chart, the most widely used chart in Riemannian geometry, for a set of unorganized data points. First, the two input parameters (the neighborhood size k and the intrinsic dimension d) are estimated from an efficient simplicial reconstruction of the underlying manifold. Then, the normal coordinates are computed to map the input high-dimensional data into a low-dimensional space. Experiments on synthetic data as well as real-world images demonstrate that our algorithm can learn intrinsic geometric structures of the data, preserve radial geodesic distances, and yield regular embeddings.
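To make the two-step pipeline concrete, here is a minimal Python/NumPy sketch of a normal-coordinate embedding under strong simplifying assumptions: the neighborhood size k and intrinsic dimension d are supplied by hand rather than estimated by simplicial reconstruction, and the radial direction of each point is approximated by projecting it into the tangent plane at the base point rather than being propagated along geodesics as in the paper. The function name rml_sketch is hypothetical; this is an illustration of the idea, not the authors' algorithm.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def rml_sketch(X, k=8, d=2, base=0):
    """Map (n, D) samples X of a d-manifold to d-dimensional
    Riemannian-normal-style coordinates centered at X[base].
    k and d are assumed given, not estimated as in the paper."""
    n = X.shape[0]
    # k-nearest-neighbor graph weighted by Euclidean edge length
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    nbr = np.argsort(dist, axis=1)[:, 1:k + 1]      # skip self at column 0
    rows = np.repeat(np.arange(n), k)
    G = csr_matrix((dist[rows, nbr.ravel()], (rows, nbr.ravel())),
                   shape=(n, n))
    # radial geodesic distances from the base point (graph shortest paths);
    # points disconnected from the base come back as inf
    geo = dijkstra(G, directed=False, indices=base)
    # orthonormal tangent basis at the base point via local PCA (SVD)
    _, _, Vt = np.linalg.svd(X[nbr[base]] - X[base], full_matrices=False)
    T = Vt[:d].T                                    # (D, d) tangent basis
    # crude radial directions: project every point into the tangent plane
    # (the paper instead propagates directions along geodesics)
    P = (X - X[base]) @ T
    r = np.linalg.norm(P, axis=1)
    r[r == 0] = 1.0                                 # base point stays at the origin
    return geo[:, None] * (P / r[:, None])
```

On a densely sampled Swiss roll, Y = rml_sketch(X, k=8, d=2) yields a 2-D chart in which each point lies at its graph-geodesic radius from the origin, so radial geodesic distances from the base point are preserved by construction; the angular placement, however, is only as good as the tangent-plane projection, which degrades on strongly curved charts.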

Index Terms:
Dimensionality reduction, manifold learning, manifold reconstruction, Riemannian manifolds, Riemannian normal coordinates.
Citation:
Tong Lin, Hongbin Zha, "Riemannian Manifold Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 5, pp. 796-809, May 2008, doi:10.1109/TPAMI.2007.70735