

IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 10, pp. 2026-2038, October 2011

Lijun Zhang , Zhejiang University, Hangzhou

Chun Chen , Zhejiang University, Hangzhou

Jiajun Bu , Zhejiang University, Hangzhou

Deng Cai , Zhejiang University, Hangzhou

Xiaofei He , Zhejiang University, Hangzhou

Thomas S. Huang , University of Illinois at Urbana-Champaign, Urbana

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2011.20

ABSTRACT

We consider the active learning problem, which aims to select the most representative points. Among the many existing active learning techniques, optimum experimental design (OED) has received considerable attention recently. Typical OED criteria minimize the variance of the parameter estimates or the predicted values. However, these methods consider only the global Euclidean structure of the data and ignore the local manifold structure. For example, I-optimal design selects data points such that the remaining points can be best approximated by linear combinations of all the selected points. In this paper, we propose a novel active learning algorithm that takes the local structure of the data space into account: each data point should be approximated by a linear combination of only its neighbors. Given the local reconstruction coefficients for every data point and the coordinates of the selected points, a transductive learning algorithm called Locally Linear Reconstruction (LLR) is proposed to reconstruct every other point. The most representative points are thus defined as those whose coordinates can best reconstruct the whole data set. Sequential and convex optimization schemes are introduced to solve the resulting optimization problem. Experimental results demonstrate the effectiveness of the proposed method.
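The local reconstruction coefficients the abstract refers to can be computed as in the first step of Locally Linear Embedding: for each point, solve a small constrained least-squares problem over its nearest neighbors. The sketch below illustrates that step only; the function name, parameters (`k`, `reg`), and the LLE-style regularization are illustrative assumptions, and the paper's exact formulation may differ.

```python
import numpy as np

def local_reconstruction_weights(X, k=5, reg=1e-3):
    """For each row of X, compute weights that best reconstruct it from its
    k nearest neighbors (least squares with a sum-to-one constraint), in the
    spirit of LLE's first step. Illustrative sketch, not the paper's code."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbors of point i, excluding the point itself
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]
        # local Gram matrix of neighbor offsets
        Z = X[nbrs] - X[i]
        G = Z @ Z.T
        # regularize for numerical stability (standard LLE trick)
        G += reg * np.trace(G) * np.eye(k)
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()  # enforce the sum-to-one constraint
    return W
```

Given such a coefficient matrix `W` and the coordinates of a selected subset, the unselected points can then be reconstructed through the neighborhood relations, which is the transductive step the abstract describes.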

INDEX TERMS

Active learning, experimental design, local structure, reconstruction.

CITATION

Lijun Zhang, Chun Chen, Jiajun Bu, Deng Cai, Xiaofei He, Thomas S. Huang, "Active Learning Based on Locally Linear Reconstruction," *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 33, no. 10, pp. 2026-2038, October 2011, doi:10.1109/TPAMI.2011.20

- [1] X. Zhu, "Semi-Supervised Learning Literature Survey," Technical Report 1530, Dept. of Computer Sciences, Univ. of Wisconsin–Madison, 2005.
- [2] D. Zhou, O. Bousquet, T.N. Lal, J. Weston, and B. Schölkopf, "Learning with Local and Global Consistency,"
Advances in Neural Information Processing Systems, vol. 16, pp. 321-328, MIT Press, 2004.- [3] M. Belkin, P. Niyogi, and V. Sindhwani, "Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples,"
J. Machine Learning Research, vol. 7, pp. 2399-2434, 2006.- [4] D.A. Cohn, Z. Ghahramani, and M.I. Jordan, "Active Learning with Statistical Models,"
J. Artificial Intelligence Research, vol. 4, pp. 129-145, 1996.- [5] B. Settles, "Active Learning Literature Survey," Computer Sciences Technical Report 1648, Univ. of Wisconsin–Madison, 2009.
- [6] D.D. Lewis and W.A. Gale, "A Sequential Algorithm for Training Text Classifiers,"
Proc. ACM SIGIR, pp. 3-12, 1994.- [7] S. Tong and D. Koller, "Support Vector Machine Active Learning with Applications to Text Classification,"
J. Machine Learning Research, vol. 2, pp. 45-66, 2002.- [8] M. Lindenbaum, S. Markovitch, and D. Rusakov, "Selective Sampling for Nearest Neighbor Classifiers,"
Machine Learning, vol. 54, no. 2, pp. 125-152, 2004.- [9] A. Fujii, T. Tokunaga, K. Inui, and H. Tanaka, "Selective Sampling for Example-Based Word Sense Disambiguation,"
Computational Linguistics, vol. 24, no. 4, pp. 573-597, 1998.- [10] H.S. Seung, M. Opper, and H. Sompolinsky, "Query by Committee,"
Proc. Fifth Ann. Workshop Computational Learning Theory, pp. 287-294, 1992.- [11] P. Melville and R.J. Mooney, "Diverse Ensembles for Active Learning,"
Proc. 21st Int'l Conf. Machine learning, 2004.- [12] N. Roy and A. McCallum, "Toward Optimal Active Learning through Sampling Estimation of Error Reduction,"
Proc. 18th Int'l Conf. Machine Learning, pp. 441-448, 2001.- [13] X. Zhu, J. Lafferty, and Z. Ghahramani, "Combining Active Learning and Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions,"
Proc. Int'l Conf. Machine Learning Workshop Continuum from Labeled to Unlabeled Data in Machine Learning and Data Mining, pp. 58-65, 2003.- [14] A. Atkinson, A. Donev, and R. Tobias,
Optimum Experimental Designs, with SAS. Oxford Univ. Press, 2007.- [15] X. He, W. Min, D. Cai, and K. Zhou, "Laplacian Optimal Design for Image Retrieval,"
Proc. ACM SIGIR, pp. 119-126, 2007.- [16] K. Yu, J. Bi, and V. Tresp, "Active Learning via Transductive Experimental Design,"
Proc. 23rd Int'l Conf. Machine Learning, pp. 1081-1088, 2006.- [17] L. Zhang, C. Chen, W. Chen, J. Bu, D. Cai, and X. He, "Convex Experimental Design Using Manifold Structure for Image Retrieval,"
Proc. 17th ACM Int'l Conf. Multimedia, pp. 45-53, 2009.- [18] S.T. Roweis and L.K. Saul, "Nonlinear Dimensionality Reduction by Locally Linear Embedding,"
Science, vol. 290, no. 5500, pp. 2323-2326, Dec. 2000.- [19] J. Tenenbaum, V. de Silva, and J. Langford, "A Global Geometric Framework for Nonlinear Dimensionality Reduction,"
Science, vol. 290, no. 5500, pp. 2319-2323, 2000.- [20] M. Belkin and P. Niyogi, "Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering,"
Advances in Neural Information Processing Systems, vol. 14, pp. 585-591, MIT Press, 2002.- [21] X. He and P. Niyogi, "Locality Preserving Projections,"
Advances in Neural Information Processing Systems, vol. 16, pp. 153-160, MIT Press, 2004.- [22] T. Hastie, R. Tibshirani, and J. Friedman,
The Elements of Statistical Learning. Springer, 2009.- [23] S.P. Asprey and S. Macchietto, "Designing Robust Optimal Dynamic Experiments,"
J. Process Control, vol. 12, no. 4, pp. 545-556, 2002.- [24] R.H. Hardin and N.J.A. Sloane, "A New Approach to the Construction of Optimal Designs,"
J. Statistical Planning and Inference, vol. 37, no. 3, pp. 339-369, 1993.- [25] S.B. Grary and C. Spera, "Optimal Experimental Design for Combinatorial Problems,"
Computational Economics, vol. 9, no. 3, pp. 241-255, Aug. 1996.- [26] X. Li and Y. Pang, "Deterministic Column-Based Matrix Decomposition,"
IEEE Trans. Knowledge and Data Eng., vol. 22, no. 1, pp. 145-149, Jan. 2010.- [27] C. Chen, L. Zhang, J. Bu, C. Wang, and W. Chen, "Constrained Laplacian Eigenmap for Dimensionality Reduction,"
Neurocomputing, vol. 73, nos. 4-6, pp. 951-958, 2010.- [28] M. Belkin, I. Matveeva, and P. Niyogi, "Regularization and Semi-Supervised Learning on Large Graphs,"
Proc. 17th Ann. Conf. Computational Learning Theory, pp. 624-638, 2004.- [29] X. Zhu, Z. Ghahramani, and J.D. Lafferty, "Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions,"
Proc. 20th Int'l Conf. Machine Learning, pp. 912-919, 2003.- [30] C.J.C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition,"
Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.- [31] Y. Chen, T.A. Davis, W.W. Hager, and S. Rajamanickam, "Algorithm 887: Cholmod, Supernodal Sparse Cholesky Factorization and Update/Downdate,"
ACM Trans. Math. Software, vol. 35, no. 3, pp. 1-14, 2008.- [32] G.H. Golub and C.F. Van Loan,
Matrix Computations, third ed. Johns Hopkins Univ. Press, 1996.- [33] S. Boyd and L. Vandenberghe,
Convex Optimization. Cambridge Univ. Press, 2004.- [34] C.-C. Chang and C.-J. Lin LIBSVM: A Library for Support Vector Machines, http://www.csie.ntu.edu.tw/~cjlinlibsvm, 2001.
- [35] M. Grant, and S. Boyd, "CVX: Matlab Software for Disciplined Convex Programming, Version 1.21," http://cvxr.comcvx, Oct. 2010.
- [36] M. Grant and S. Boyd, "Graph Implementations for Nonsmooth Convex Programs,"
Recent Advances in Learning and Control, V. Blondel, S. Boyd, and H. Kimura, eds., vol. 371, pp. 95-110, Springer, 2008.- [37] H. Yu, M. Li, H. Jiang Zhang, and J. Feng, "Color Texture Moments for Content-Based Image Retrieval,"
Proc. Int'l Conf. Image Processing, pp. 24-28, 2002.- [38] M. Muja and D.G. Lowe, "Fast Approximate Nearest Neighbors with Automatic Algorithm Configuration,"
Proc. Int'l Conf. Computer Vision Theory and Application, pp. 331-340, 2009.- [39] S. Arya, D.M. Mount, N.S. Netanyahu, R. Silverman, and A.Y. Wu, "An Optimal Algorithm for Approximate Nearest Neighbor Searching Fixed Dimensions,"
J. ACM, vol. 45, no. 6, pp. 891-923, 1998.- [40] H. Lee, A. Battle, R. Raina, and A.Y. Ng, "Efficient Sparse Coding Algorithms,"
Advances in Neural Information Processing Systems, vol. 19, pp. 801-808, MIT Press, 2007.- [41] J. Wright, A.Y. Yang, A. Ganesh, S.S. Sastry, and Y. Ma, "Robust Face Recognition via Sparse Representation,"
IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 2, pp. 210-227, Feb. 2009. |