Issue No. 4 - April 2013 (vol. 25), pp. 906-918
Xiaoxiao Shi , University of Illinois at Chicago, Chicago
Qi Liu , Tongji University, Shanghai
Wei Fan , IBM T.J. Watson Research Center, Hawthorne
Philip S. Yu , University of Illinois at Chicago, Chicago
ABSTRACT
In many applications, it is expensive or time consuming to obtain many labeled examples. One practically important question is: can labeled data from other related sources help predict the target task, even if they have 1) different feature spaces (e.g., image versus text data), 2) different data distributions, and 3) different output spaces? This paper proposes a solution and discusses the conditions under which it is likely to produce better results. First, it unifies the feature spaces of the target and source data sets by spectral embedding, even when those feature spaces are completely different. The principle is to devise an optimization objective that preserves the original structure of the data while maximizing the similarity between the two data sets. A linear projection model and a nonlinear approach, both with closed-form solutions, are derived from this principle. Second, a judicious sample selection strategy is applied to select only related source examples. Finally, a Bayesian approach is applied to model the relationship between the different output spaces. Together, these three steps bridge related heterogeneous sources in order to learn the target task. Among the 20 experimental data sets, for example, images with wavelet-transform-based features are used to predict another set of images whose features are constructed from the color-histogram space, and documents are used to help image classification. By using the examples extracted from heterogeneous sources, the models reduce the error rate by as much as 50 percent compared with methods that use only examples from the target task.
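
The first step, unifying heterogeneous feature spaces by spectral embedding, can be sketched as follows. This is a minimal, hypothetical illustration using scikit-learn's Laplacian-eigenmap-style SpectralEmbedding rather than the authors' closed-form objective: it only shows the structure-preserving part (embedding each data set into a common k-dimensional space), whereas the paper's method additionally maximizes the similarity between the two embedded data sets. The data sets, dimensions, and variable names below are assumptions for illustration.

# Minimal sketch (not the authors' exact objective): embed two data sets with
# different feature spaces into a shared low-dimensional space, so that source
# examples can later be screened and pooled with the labeled target examples.
import numpy as np
from sklearn.manifold import SpectralEmbedding

def embed(X, k=10, n_neighbors=10):
    """Laplacian-eigenmap style embedding that preserves a data set's local
    neighborhood structure; a stand-in for the structure-preserving term."""
    se = SpectralEmbedding(n_components=k, n_neighbors=n_neighbors)
    return se.fit_transform(X)

# Hypothetical inputs: the two data sets share no feature columns, only the
# embedded dimension k (e.g., wavelet features vs. color-histogram features).
rng = np.random.default_rng(0)
X_src = rng.normal(size=(200, 64))   # source: e.g., wavelet-transform features
X_tgt = rng.normal(size=(150, 32))   # target: e.g., color-histogram features

Z_src = embed(X_src, k=10)
Z_tgt = embed(X_tgt, k=10)

# Both data sets now live in R^10; the paper's subsequent steps (sample
# selection and Bayesian output-space modeling) would operate on Z_src, Z_tgt.
print(Z_src.shape, Z_tgt.shape)   # (200, 10) (150, 10)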
INDEX TERMS
Optimization, data models, bridges, vectors, training data, training, bioinformatics, transfer learning, feature generation, heterogeneous data
CITATION
Xiaoxiao Shi, Qi Liu, Wei Fan, Philip S. Yu, "Transfer across Completely Different Feature Spaces via Spectral Embedding," IEEE Transactions on Knowledge and Data Engineering, vol. 25, no. 4, pp. 906-918, April 2013, doi:10.1109/TKDE.2011.252