
Bibliographic References  
Shen-Shyang Ho and Harry Wechsler, "Query by Transduction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 9, pp. 1557-1571, September 2008, doi:10.1109/TPAMI.2007.70811.
[1] V.N. Vapnik, The Nature of Statistical Learning Theory, second ed. Springer, 2000.
[2] T. Joachims, “Transductive Inference for Text Classification Using Support Vector Machines,” Proc. 16th Int'l Conf. Machine Learning, I. Bratko and S. Dzeroski, eds., pp. 200-209, 1999.
[3] F. Li and H. Wechsler, “Open Set Face Recognition Using Transduction,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 11, pp. 1686-1697, Nov. 2005.
[4] M. Okabe, K. Umemura, and S. Yamada, “Query Expansion with the Minimum User Feedback by Transductive Learning,” Proc. Human Language Technology Conf. and Conf. Empirical Methods in Natural Language Processing (HLT/EMNLP '05), pp. 963-970, 2005.
[5] R. Craig and L. Liao, “Protein Classification Using Transductive Learning on Phylogenetic Profiles,” Proc. ACM Symp. Applied Computing, pp. 161-166, 2006.
[6] S. Tong and D. Koller, “Support Vector Machine Active Learning with Applications to Text Classification,” J. Machine Learning Research, vol. 2, pp. 45-66, 2001.
[7] V. Vovk, A. Gammerman, and G. Shafer, Algorithmic Learning in a Random World. Springer, 2005.
[8] M. Li and P. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications, second ed. Springer, 1997.
[9] A. Gammerman and V. Vovk, “Prediction Algorithms and Confidence Measures Based on Algorithmic Randomness Theory,” Theoretical Computer Science, vol. 287, no. 1, pp. 209-217, 2002.
[10] S. Kullback, Information Theory and Statistics. John Wiley & Sons, 1959.
[11] Y. Freund, H.S. Seung, E. Shamir, and N. Tishby, “Selective Sampling Using the Query by Committee Algorithm,” Machine Learning, vol. 28, nos. 2-3, pp. 133-168, 1997.
[12] D.A. Cohn, Z. Ghahramani, and M.I. Jordan, “Active Learning with Statistical Models,” J. Artificial Intelligence Research, vol. 4, pp. 129-145, 1996.
[13] T. Zhang and F. Oles, “A Probability Analysis on the Value of Unlabeled Data for Classification Problems,” Proc. 17th Int'l Conf. Machine Learning, pp. 1191-1198, 2000.
[14] M. Li and I. Sethi, “Confidence-Based Active Learning,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 8, pp. 1251-1261, Aug. 2006.
[15] D.D. Lewis and J. Catlett, “Heterogeneous Uncertainty Sampling for Supervised Learning,” Proc. 11th Int'l Conf. Machine Learning, pp. 148-156, 1994.
[16] D. Mackay, “Information-Based Objective Functions for Active Data Selection,” Neural Computation, vol. 4, no. 4, pp. 590-604, 1992.
[17] C. Zhang and T. Chen, “An Active Learning Framework for Content-Based Information Retrieval,” IEEE Trans. Multimedia, vol. 4, no. 2, pp. 260-268, 2002.
[18] H.S. Seung, M. Opper, and H. Sompolinsky, “Query by Committee,” Proc. Fifth Ann. Conf. Learning Theory, pp. 287-294, 1992.
[19] X. Zhu, J. Lafferty, and Z. Ghahramani, “Combining Active Learning and Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions,” Proc. ICML Workshop Continuum from Labeled to Unlabeled Data in Machine Learning and Data Mining, 2003.
[20] N. Roy and A. McCallum, “Toward Optimal Active Learning through Sampling Estimation of Error Reduction,” Proc. 18th Int'l Conf. Machine Learning, pp. 441-448, 2001.
[21] R. Yan, J. Yang, and A.G. Hauptmann, “Automatically Labeling Video Data Using Multi-Class Active Learning,” Proc. Ninth Int'l Conf. Computer Vision, pp. 516-523, 2003.
[22] G. Schohn and D. Cohn, “Less Is More: Active Learning with Support Vector Machines,” Proc. 17th Int'l Conf. Machine Learning, pp. 839-846, 2000.
[23] C. Campbell, N. Cristianini, and A.J. Smola, “Query Learning with Large Margin Classifiers,” Proc. 17th Int'l Conf. Machine Learning, pp. 111-118, 2000.
[24] K. Brinker, “Active Learning with Kernel Machines,” PhD dissertation, Univ. of Paderborn, 2004.
[25] P. Mitra, C.A. Murthy, and S.K. Pal, “A Probabilistic Active Support Vector Learning Algorithm,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 3, pp. 413-418, Mar. 2004.
[26] N. Abe and H. Mamitsuka, “Query Learning Strategies Using Boosting and Bagging,” Proc. 15th Int'l Conf. Machine Learning, pp. 1-9, 1998.
[27] I. Dagan and S. Engelson, “Committee-Based Sampling for Training Probabilistic Classifiers,” Proc. 12th Int'l Conf. Machine Learning, pp. 150-157, 1995.
[28] G. Tur, R.E. Schapire, and D. Hakkani-Tur, “Active Learning for Spoken Language Understanding,” Proc. IEEE Int'l Conf. Acoustics, Speech and Signal Processing, 2003.
[29] P. Melville and R. Mooney, “Diverse Ensembles for Active Learning,” Proc. 21st Int'l Conf. Machine Learning, pp. 584-591, 2004.
[30] R. Gilad-Bachrach, A. Navot, and N. Tishby, “Query by Committee Made Real,” Proc. Ann. Conf. Advances in Neural Information Processing Systems (NIPS '05), 2005.
[31] S. Dasgupta, A. Kalai, and C. Monteleoni, “Analysis of Perceptron-Based Active Learning,” Proc. 18th Ann. Conf. Learning Theory, 2005.
[32] A. McCallum and K. Nigam, “Employing EM and Pool-Based Active Learning for Text Classification,” Proc. 15th Int'l Conf. Machine Learning, pp. 359-367, 1998.
[33] P. Melville, S. Yang, M. Saar-Tsechansky, and R. Mooney, “Active Learning for Probability Estimation Using Jensen-Shannon Divergence,” Proc. European Conf. Machine Learning, pp. 268-279, 2005.
[34] R. Yan, J. Yang, and A. Hauptmann, “Automatically Labeling Video Data Using Multi-Class Active Learning,” Proc. Ninth IEEE Int'l Conf. Computer Vision, pp. 516-523, 2003.
[35] K.S. Goh, E.Y. Chang, and W.C. Lai, “Multimodal Concept-Dependent Active Learning for Image Retrieval,” Proc. 12th ACM Int'l Conf. Multimedia, pp. 564-571, 2004.
[36] S. Tong and E.Y. Chang, “Support Vector Machine Active Learning for Image Retrieval,” Proc. ACM Multimedia, pp. 107-118, 2001.
[37] T. Luo, K. Kramer, D.B. Goldgof, L.O. Hall, S. Samson, A. Remsen, and T. Hopkins, “Active Learning to Recognize Multiple Types of Plankton,” J. Machine Learning Research, vol. 6, pp. 589-613, 2005.
[38] M.K. Warmuth, J. Liao, G. Raetsch, M. Mathieson, S. Putta, and C. Lemmen, “Active Learning with Support Vector Machines in the Drug Discovery Process,” J. Chemical Information and Computer Sciences, vol. 43, pp. 667-673, 2003.
[39] C. Dima and M. Hebert, “Active Learning for Outdoor Obstacle Detection,” Robotics: Science and Systems, pp. 9-16, 2005.
[40] R. Yan and A. Hauptmann, “Multi-Class Active Learning for Video Semantic Feature Extraction,” Proc. IEEE Int'l Conf. Multimedia and Expo, pp. 67-72, 2004.
[41] S.C.H. Hoi, R. Jin, J. Zhu, and M.R. Lyu, “Batch Mode Active Learning and Its Application to Medical Image Classification,” Proc. 23rd Int'l Conf. Machine Learning, pp. 417-424, 2006.
[42] S.C.H. Hoi, R. Jin, and M.R. Lyu, “Large-Scale Text Categorization by Batch Mode Active Learning,” Proc. 15th Int'l Conf. World Wide Web, pp. 633-642, 2006.
[43] Y. Baram, R. Yaniv, and K. Luz, “Online Choice of Active Learning Algorithms,” J. Machine Learning Research, pp. 255-291, 2004.
[44] R. Kothari and V. Jain, “Learning from Labeled and Unlabeled Data Using a Minimal Number of Queries,” IEEE Trans. Neural Networks, vol. 14, no. 6, 2003.
[45] V. Cherkassky and F. Mulier, Learning from Data: Concepts, Theory, and Methods. John Wiley & Sons, 1998.
[46] Semi-Supervised Learning, O. Chapelle, B. Schölkopf, and A. Zien, eds. MIT Press, 2006.
[47] K. Yu, J. Bi, and V. Tresp, “Active Learning via Transductive Experimental Design,” Proc. 23rd Int'l Conf. Machine Learning, pp. 1081-1088, 2006.
[48] S.S. Ho and H. Wechsler, “Transductive Confidence Machines for Active Learning,” Proc. Int'l Joint Conf. Neural Networks (IJCNN '03), 2003.
[49] V. Vovk, A. Gammerman, and C. Saunders, “Machine-Learning Applications of Algorithmic Randomness,” Proc. 16th Int'l Conf. Machine Learning, I. Bratko and S. Dzeroski, eds., pp. 444-453, 1999.
[50] S. Weerahandi, Exact Statistical Methods for Data Analysis. Springer, 1994.
[51] K. Proedrou, I. Nouretdinov, V. Vovk, and A. Gammerman, “Transductive Confidence Machines for Pattern Recognition,” Proc. 13th European Conf. Machine Learning, T. Elomaa, H. Mannila, and H. Toivonen, eds., pp. 381-390, 2002.
[52] C. Saunders, A. Gammerman, and V. Vovk, “Transduction with Confidence and Credibility,” Proc. 16th Int'l Joint Conf. Artificial Intelligence, T. Dean, ed., pp. 722-726, 1999.
[53] T. Melluish, C. Saunders, I. Nouretdinov, and V. Vovk, “Comparing the Bayes and Typicalness Frameworks,” Proc. 12th European Conf. Machine Learning, pp. 360-371, 2001.
[54] G. Cauwenberghs and T. Poggio, “Incremental Support Vector Machine Learning,” Advances in Neural Information Processing Systems 13, pp. 409-415. MIT Press, 2000.
[55] S.S. Ho and H. Wechsler, “Learning from Data Streams via Online Transduction,” Proc. ICDM Workshop Temporal Data Mining: Algorithms, Theory and Applications (TDM '04), 2004.
[56] T. Sellke, M.J. Bayarri, and J.O. Berger, “Calibration of p-values for Testing Precise Null Hypotheses,” The Am. Statistician, vol. 55, pp. 62-71, 2001.
[57] T. Graepel, R. Herbrich, and K. Obermayer, “Bayesian Transduction,” Proc. Ann. Conf. Advances in Neural Information Processing Systems (NIPS '99), S.A. Solla, T.K. Leen, and K.-R. Müller, eds., pp. 456-462, 1999.
[58] T. Graepel and R. Herbrich, “The Kernel Gibbs Sampler,” Proc. Ann. Conf. Advances in Neural Information Processing Systems (NIPS '00), T.K. Leen, T.G. Dietterich, and V. Tresp, eds., pp. 514-520, 2000.
[59] Y. LeCun, B. Boser, J.S. Denker, D. Henderson, R.E. Howard, W. Hubbard, and L.J. Jackel, “Backpropagation Applied to Handwritten Zip Code Recognition,” Neural Computation, vol. 1, pp. 541-551, 1989.
[60] G. Rätsch, T. Onoda, and K.-R. Müller, “Soft Margins for Adaboost,” Machine Learning, vol. 42, no. 3, pp. 287-320, 2001.
[61] P. Frey and D. Slate, “Letter Recognition Using Holland-Style Adaptive Classifiers,” Machine Learning, vol. 6, pp. 161-182, 1991.