Issue No. 9, September 2009 (vol. 21)
pp. 1285-1298
Feiping Nie, Tsinghua National Laboratory for Information Science and Technology, Beijing
Shiming Xiang, Tsinghua National Laboratory for Information Science and Technology, Beijing
Chunxia Zhang, Beijing Institute of Computer Science, Beijing
ABSTRACT
This paper presents a new algorithm for Nonlinear Dimensionality Reduction (NLDR). Our algorithm is developed under the conceptual framework of compatible mapping. Each such mapping is the composition of a tangent space projection and a group of splines. The tangent space projection is estimated at each data point on the manifold, through which the point itself and its neighbors are represented in the tangent space with local coordinates. Splines are then constructed to guarantee that each local coordinate is mapped to its own single global coordinate with respect to the underlying manifold, which ensures the compatibility between local alignments. Within this setting, we develop an optimization framework based on reconstruction error analysis that yields a global optimum. The proposed algorithm is also extended to embed out-of-sample points via spline interpolation. Experiments on toy and real-world data sets illustrate the validity of our method.
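To make the tangent space projection step described above more concrete, the following is a minimal sketch in Python/NumPy. It is not the authors' implementation: the function name local_tangent_coordinates, the neighborhood size k, and the target dimension d are illustrative assumptions, and the spline construction and global alignment that complete a compatible mapping are not shown. The sketch estimates the tangent space at a point by local PCA over its k nearest neighbors and expresses the neighborhood in local coordinates, as the abstract describes.

import numpy as np

def local_tangent_coordinates(X, i, k=8, d=2):
    """Project the k-neighborhood of X[i] onto an estimated d-dim tangent space.

    X : (n, D) array of data points sampled from a manifold.
    Returns (k+1, d) local coordinates for X[i] and its k nearest neighbors,
    plus the neighbor indices. (Hypothetical helper, not from the paper.)
    """
    # Find the k nearest neighbors of X[i] (brute force for clarity).
    dists = np.linalg.norm(X - X[i], axis=1)
    nbr_idx = np.argsort(dists)[:k + 1]        # includes the point itself
    nbrs = X[nbr_idx]

    # Center the neighborhood at its mean.
    centered = nbrs - nbrs.mean(axis=0)

    # The top-d right singular vectors span the estimated tangent space.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    T = Vt[:d].T                               # (D, d) tangent basis

    # Local coordinates: orthogonal projection onto the tangent basis.
    return centered @ T, nbr_idx

# Toy usage: points on a "Swiss roll" surface embedded in 3-D.
rng = np.random.default_rng(0)
t = 3 * np.pi * (1 + 2 * rng.random(500)) / 2
X = np.column_stack([t * np.cos(t), 21 * rng.random(500), t * np.sin(t)])
coords, idx = local_tangent_coordinates(X, i=0, k=8, d=2)
print(coords.shape)  # (9, 2): local 2-D coordinates for the neighborhood

In the paper's framework, these local coordinates would then be mapped by splines to single global coordinates so that overlapping neighborhoods agree; the sketch stops at the local representation.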
INDEX TERMS
Nonlinear dimensionality reduction, compatible mapping, local spline embedding, out-of-sample extension.
CITATION
Feiping Nie, Shiming Xiang, Chunxia Zhang, "Nonlinear Dimensionality Reduction with Local Spline Embedding," IEEE Transactions on Knowledge and Data Engineering, vol. 21, no. 9, pp. 1285-1298, September 2009, doi:10.1109/TKDE.2008.204