
Issue No. 9, September 2009 (vol. 21)

pp. 1285-1298

Shiming Xiang , Tsinghua National Laboratory for Information Science and Technology, Beijing

Feiping Nie , Tsinghua National Laboratory for Information Science and Technology, Beijing

Changshui Zhang , Tsinghua National Laboratory for Information Science and Technology, Beijing

Chunxia Zhang , Beijing Institute of Computer Science, Beijing

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TKDE.2008.204

ABSTRACT

This paper presents a new algorithm for Nonlinear Dimensionality Reduction (NLDR). The algorithm is developed under the conceptual framework of compatible mapping, in which each mapping is a composition of a tangent space projection and a group of splines. A tangent space projection is estimated at each data point on the manifold, through which the point and its neighbors are represented in the tangent space with local coordinates. Splines are then constructed to guarantee that each local coordinate is mapped to its own single global coordinate with respect to the underlying manifold, ensuring compatibility between the local alignments. In this setting, we develop an optimization framework based on reconstruction error analysis that yields a global optimum. The proposed algorithm is also extended to embed out-of-sample points via spline interpolation. Experiments on toy and real-world data sets illustrate the validity of the method.
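The first stage described above, estimating a tangent space at each data point and expressing the point and its neighbors in local coordinates, can be sketched with local PCA. This is only an illustrative sketch of that one stage (the spline construction and the global alignment that follow are not shown), and the function name, neighborhood size `k`, and toy data are assumptions for the example, not taken from the paper:

```python
import numpy as np

def local_tangent_coordinates(X, i, k, d):
    """Represent X[i] and its k nearest neighbors in the d-dimensional
    tangent space estimated at X[i] via local PCA.

    Returns the neighbor indices and the local coordinates of the patch.
    """
    # Find the k nearest neighbors of X[i] (the point itself is included).
    dists = np.linalg.norm(X - X[i], axis=1)
    idx = np.argsort(dists)[:k + 1]
    patch = X[idx]
    # Center the neighborhood and take the top-d right singular vectors
    # as an orthonormal basis of the estimated tangent space.
    mean = patch.mean(axis=0)
    _, _, Vt = np.linalg.svd(patch - mean, full_matrices=False)
    basis = Vt[:d].T                   # shape (D, d)
    # Project the centered patch onto the tangent basis: local coordinates.
    local = (patch - mean) @ basis     # shape (k+1, d)
    return idx, local

# Toy example: samples near a one-dimensional curve embedded in 3D.
t = np.linspace(0.0, 1.0, 50)
X = np.stack([t, np.sin(t), np.cos(t)], axis=1)
idx, local = local_tangent_coordinates(X, i=25, k=6, d=1)
print(local.shape)  # (7, 1)
```

Each such local patch then gets its own spline mapping in the paper's method, so that overlapping patches agree on the single global coordinate of every shared point.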

INDEX TERMS

Nonlinear dimensionality reduction, compatible mapping, local spline embedding, out of samples.

CITATION

Shiming Xiang, Feiping Nie, Changshui Zhang, Chunxia Zhang, "Nonlinear Dimensionality Reduction with Local Spline Embedding",

*IEEE Transactions on Knowledge & Data Engineering*, vol. 21, no. 9, pp. 1285-1298, September 2009, doi:10.1109/TKDE.2008.204
