Ensemble Manifold Regularization
June 2012 (vol. 34 no. 6)
pp. 1227-1233
Chao Xu, Key Lab. of Machine Perception (Minist. of Educ.), Peking Univ., Beijing, China
Dacheng Tao, Centre for Quantum Comput. & Intell. Syst., Univ. of Technol., Sydney, NSW, Australia
Bo Geng, Key Lab. of Machine Perception (Minist. of Educ.), Peking Univ., Beijing, China
Linjun Yang, Microsoft Res. Asia, Beijing, China
Xian-Sheng Hua, Microsoft, Redmond, WA, USA
We propose an automatic approximation of the intrinsic manifold for general semi-supervised learning (SSL) problems. Unfortunately, it is not trivial to define an optimization function for obtaining the optimal hyperparameters. Usually, cross validation is applied, but it does not necessarily scale up; further problems arise from the suboptimality incurred by discrete grid search and from overfitting. Therefore, we develop an ensemble manifold regularization (EMR) framework that approximates the intrinsic manifold by combining several initial guesses. Algorithmically, EMR is carefully designed so that it 1) learns the composite manifold and the semi-supervised learner jointly, 2) is fully automatic, learning the intrinsic manifold hyperparameters implicitly, 3) is conditionally optimal for intrinsic manifold approximation under a mild and reasonable assumption, and 4) is scalable, in both time and space, to a large number of candidate manifold hyperparameters. Furthermore, we prove that EMR converges to the deterministic matrix at rate root-n. Extensive experiments over both synthetic and real data sets demonstrate the effectiveness of the proposed framework.
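The abstract describes the EMR mechanics only at a high level. As a minimal sketch of the idea, not the paper's exact algorithm, the following assumes a kernel regularized least-squares base learner, Gaussian-weighted kNN graph Laplacians as the candidate "initial guesses" of the manifold, and an alternating scheme in which the combination weights are updated in closed form on the probability simplex. All function names and parameters (`gam_A`, `gam_I`, `gam_R`) are our own illustrative notation, not from the paper.

```python
import numpy as np

def knn_laplacian(X, k, sigma):
    """Gaussian-weighted kNN graph Laplacian; each (k, sigma) pair gives
    one candidate approximation of the intrinsic manifold."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]              # skip self (distance 0)
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                              # symmetrize the graph
    return np.diag(W.sum(axis=1)) - W                   # combinatorial L = D - W

def project_simplex(v):
    """Euclidean projection of v onto {mu : mu >= 0, sum(mu) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    return np.maximum(v - (css[rho] - 1.0) / (rho + 1.0), 0.0)

def emr_fit(K, y, labeled, laplacians, gam_A=1e-3, gam_I=1e-2,
            gam_R=1.0, iters=20):
    """Alternating optimization: (1) fit a kernel RLS learner under the
    composite Laplacian sum_k mu_k * L_k, then (2) update the simplex
    weights mu in closed form; an l2 term gam_R * ||mu||^2 keeps the
    combination from collapsing onto a single candidate."""
    n = K.shape[0]
    J = np.zeros((n, n)); J[labeled, labeled] = 1.0     # selects labeled points
    mu = np.full(len(laplacians), 1.0 / len(laplacians))
    for _ in range(iters):
        Lc = sum(m * L for m, L in zip(mu, laplacians))
        A = J @ K + gam_A * n * np.eye(n) + gam_I * (Lc @ K)
        alpha = np.linalg.solve(A, y)                   # y is 0 on unlabeled points
        f = K @ alpha                                   # current predictions
        s = np.array([f @ L @ f for L in laplacians])   # smoothness per candidate
        mu = project_simplex(-gam_I * s / (2.0 * gam_R))
    return f, mu
```

With several (k, sigma) candidates over two well-separated clusters and one label per cluster, the returned weights stay on the simplex and the predictions separate the clusters, which is the joint learning behavior points 1) and 2) of the abstract refer to.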

[1] sets krbd/, 2012.
[2] manifold.html, 2012.
[3] http://www.ics.uci.edu/~mlearn/, 2012.
[4] A. Argyriou, M. Herbster, and M. Pontil, "Combining Graph Laplacians for Semi-Supervised Learning," Proc. Advances in Neural Information Processing Systems 18, pp. 67-74, 2005.
[5] M. Belkin and P. Niyogi, "Using Manifold Structure for Partially Labelled Classification," Proc. Neural Information Processing System Conf., 2002.
[6] M. Belkin and P. Niyogi, "Laplacian Eigenmaps for Dimensionality Reduction and Data Representation," Neural Computation, vol. 15, pp. 1373-1396, 2003.
[7] M. Belkin, P. Niyogi, and V. Sindhwani, "Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples," J. Machine Learning Research, vol. 7, pp. 2399-2434, 2006.
[8] K.P. Bennett and A. Demiriz, "Semi-Supervised Support Vector Machines," Advances in Neural Information Processing Systems, vol. 12, pp. 368-374, 1998.
[9] J.C. Bezdek and R.J. Hathaway, "Convergence of Alternating Optimization," Neural, Parallel and Scientific Computations, vol. 11, pp. 351-368, 2003.
[10] A. Blum and T. Mitchell, "Combining Labeled and Unlabeled Data with Co-Training," Proc. 11th Ann. Conf. Computational Learning Theory, 1998.
[11] S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge Univ., 2004.
[12] Semi-Supervised Learning, O. Chapelle, B. Schölkopf, and A. Zien, eds. MIT Press, 2006.
[13] O. Chapelle, V. Sindhwani, and S.S. Keerthi, "Optimization Techniques for Semi-Supervised Support Vector Machines," J. Machine Learning Research, vol. 9, pp. 203-233, 2008.
[14] O. Chapelle, J. Weston, and B. Schölkopf, "Cluster Kernels for Semi-Supervised Learning," Proc. Advances in Neural Information Processing Systems 15, 2001.
[15] F. Girosi, M. Jones, and T. Poggio, "Regularization Theory and Neural Networks Architectures," Neural Computation, vol. 7, pp. 219-269, 1995.
[16] X. He and P. Niyogi, "Locality Preserving Projections," Proc. Advances in Neural Information Processing Systems 18, 2004.
[17] T. Joachims, "Transductive Inference for Text Classification Using Support Vector Machines," Proc. 16th Int'l Conf. Machine Learning, 1999.
[18] I. Jolliffe, Principal Component Analysis. Springer, 1986.
[19] S. Keerthi, V. Sindhwani, and O. Chapelle, "An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models," Proc. Advances in Neural Information Processing Systems 19, 2007.
[20] R. Kohavi, "A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection," Proc. 14th Int'l Joint Conf. Artificial Intelligence, pp. 1137-1145, 1995.
[21] K. Nigam, A.K. McCallum, S. Thrun, and T. Mitchell, "Text Classification from Labeled and Unlabeled Documents Using EM," Machine Learning, vol. 39, nos. 2/3, pp. 103-134, 2000.
[22] S. Rosenberg, The Laplacian on a Riemannian Manifold. Cambridge Univ., 1997.
[23] C. Siagian and L. Itti, "Rapid Biologically-Inspired Scene Classification Using Features Shared with Visual Attention," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 2, pp. 300-312, Feb. 2007.
[24] V. Sindhwani, S.S. Keerthi, and O. Chapelle, "Deterministic Annealing for Semi-Supervised Kernel Machines," Proc. 23rd Int'l Conf. Machine Learning, pp. 841-848, 2006.
[25] V. Sindhwani, P. Niyogi, and M. Belkin, "Beyond the Point Cloud: From Transductive to Semi-Supervised Learning," Proc. 22nd Int'l Conf. Machine Learning, 2005.
[26] A. Smola and R. Kondor, "Kernels and Regularization on Graphs," Proc. Conf. Learning Theory and Kernel Machines, 2003.
[27] S. Tong and D. Koller, "Support Vector Machine Active Learning with Applications to Text Classification," J. Machine Learning Research, vol. 2, pp. 999-1006, 2000.
[28] V.N. Vapnik, Statistical Learning Theory. Wiley, 1998.
[29] D. Yarowsky, "Unsupervised Word Sense Disambiguation Rivaling Supervised Methods," Proc. 33rd Ann. Meeting Assoc. for Computational Linguistics, pp. 189-196, 1995.
[30] D. Zhou, O. Bousquet, T.N. Lal, J. Weston, and B. Schölkopf, "Learning with Local and Global Consistency," Advances in Neural Information Processing Systems, vol. 16, pp. 321-328, 2004.
[31] X. Zhu, Z. Ghahramani, and J. Lafferty, "Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions," Proc. 20th Int'l Conf. Machine Learning, 2003.

Index Terms:
matrix algebra, approximation theory, learning (artificial intelligence), deterministic matrix, ensemble manifold regularization framework, intrinsic manifold automatic approximation, general semisupervised learning problems, optimization function, optimal hyperparameters, cross validation, discrete grid search, composite manifold learning, candidate manifold hyperparameters, EMR convergence property, manifolds, Laplace equations, approximation methods, kernel, algorithm design and analysis, support vector machines, loss measurement, manifold learning, semi-supervised learning, ensemble manifold regularization
Chao Xu, Dacheng Tao, Bo Geng, Linjun Yang, Xian-Sheng Hua, "Ensemble Manifold Regularization," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 6, pp. 1227-1233, June 2012, doi:10.1109/TPAMI.2012.57