
Bibliographic References  
ASCII Text
Vikas C. Raykar, Ramani Duraiswami, Balaji Krishnapuram, "A Fast Algorithm for Learning a Ranking Function from Large-Scale Data Sets," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 7, pp. 1158-1170, July 2008.
BibTeX
@article{10.1109/TPAMI.2007.70776,
  author    = {Vikas C. Raykar and Ramani Duraiswami and Balaji Krishnapuram},
  title     = {A Fast Algorithm for Learning a Ranking Function from Large-Scale Data Sets},
  journal   = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  volume    = {30},
  number    = {7},
  issn      = {0162-8828},
  year      = {2008},
  pages     = {1158-1170},
  doi       = {http://doi.ieeecomputersociety.org/10.1109/TPAMI.2007.70776},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}
RefWorks / ProCite / RefMan / EndNote
TY  - JOUR
JO  - IEEE Transactions on Pattern Analysis and Machine Intelligence
TI  - A Fast Algorithm for Learning a Ranking Function from Large-Scale Data Sets
IS  - 7
SN  - 0162-8828
SP  - 1158
EP  - 1170
A1  - Vikas C. Raykar
A1  - Ramani Duraiswami
A1  - Balaji Krishnapuram
PY  - 2008
KW  - Machine learning
KW  - Algorithms
VL  - 30
JA  - IEEE Transactions on Pattern Analysis and Machine Intelligence
ER  -
[1] A. Mas-Colell, M. Whinston, and J. Green, Microeconomic Theory. Oxford Univ. Press, 1995.
[2] G. Fung, R. Rosales, and B. Krishnapuram, “Learning Rankings via Convex Hull Separation,” Advances in Neural Information Processing Systems 18, Y. Weiss, B. Schölkopf, and J. Platt, eds. MIT Press, 2006.
[3] O. Dekel, C. Manning, and Y. Singer, “Log-Linear Models for Label Ranking,” Advances in Neural Information Processing Systems 16, S. Thrun, L. Saul, and B. Schölkopf, eds. MIT Press, 2004.
[4] F. Wilcoxon, “Individual Comparisons by Ranking Methods,” Biometrics Bull., vol. 1, no. 6, pp. 80-83, Dec. 1945.
[5] H.B. Mann and D.R. Whitney, “On a Test of Whether One of Two Random Variables is Stochastically Larger than the Other,” The Annals of Math. Statistics, vol. 18, no. 1, pp. 50-60, 1947.
[6] C. Burges, T. Shaked, E. Renshaw, A. Lazier, M. Deeds, N. Hamilton, and G. Hullender, “Learning to Rank Using Gradient Descent,” Proc. 22nd Int'l Conf. Machine Learning, 2005.
[7] Y. Freund, R. Iyer, and R. Schapire, “An Efficient Boosting Algorithm for Combining Preferences,” J. Machine Learning Research, vol. 4, pp. 933-969, 2003.
[8] L. Greengard, “Fast Algorithms for Classical Physics,” Science, vol. 265, no. 5174, pp. 909-914, 1994.
[9] R. Herbrich, T. Graepel, P. Bollmann-Sdorra, and K. Obermayer, “Learning Preference Relations for Information Retrieval,” Proc. Int'l Conf. Machine Learning Workshop Learning for Text Categorization, pp. 80-84, 1998.
[10] T. Joachims, “Optimizing Search Engines Using Clickthrough Data,” Proc. Eighth ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, pp. 133-142, 2002.
[11] W. Chu and Z. Ghahramani, “Preference Learning with Gaussian Processes,” Proc. 22nd Int'l Conf. Machine Learning, pp. 137144, 2005.
[12] R. Yan and A. Hauptmann, “Efficient Margin-Based Rank Learning Algorithms for Information Retrieval,” Proc. Int'l Conf. Image and Video Retrieval, 2006.
[13] C. Burges, R. Ragno, and Q. Le, “Learning to Rank with Nonsmooth Cost Functions,” Advances in Neural Information Processing Systems 19, B. Schölkopf, J. Platt, and T. Hoffman, eds. MIT Press, 2007.
[14] K. Crammer and Y. Singer, “Pranking with Ranking,” Advances in Neural Information Processing Systems, vol. 14, pp. 641-647, 2002.
[15] E.F. Harrington, “Online Ranking/Collaborative Filtering Using the Perceptron Algorithm,” Proc. 20th Int'l Conf. Machine Learning, 2003.
[16] R. Caruana, S. Baluja, and T. Mitchell, “Using the Future to ‘Sort Out’ the Present: Rankprop and Multitask Learning for Medical Risk Evaluation,” Advances in Neural Information Processing Systems, 1995.
[17] L. Yan, R. Dodier, M. Mozer, and R. Wolniewicz, “Optimizing Classifier Performance via an Approximation to the Wilcoxon-Mann-Whitney Statistic,” Proc. 20th Int'l Conf. Machine Learning, pp. 848-855, 2003.
[18] A. Rakotomamonjy, “Optimizing Area under the ROC Curve with SVMs,” ROC Analysis in Artificial Intelligence, pp. 71-80, 2004.
[19] U. Brefeld and T. Scheffer, “AUC Maximizing Support Vector Learning,” Proc. ICML 2005 Workshop ROC Analysis in Machine Learning, 2005.
[20] A. Herschtal and B. Raskutti, “Optimising Area under the ROC Curve Using Gradient Descent,” Proc. 21st Int'l Conf. Machine Learning, 2004.
[21] R. Herbrich, T. Graepel, and K. Obermayer, “Large Margin Rank Boundaries for Ordinal Regression,” Advances in Large Margin Classifiers, pp. 115-132, MIT Press, 2000.
[22] J. Nocedal and S.J. Wright, Numerical Optimization. Springer, 1999.
[23] M. Abramowitz and I.A. Stegun, Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. Dover, 1972.
[24] N.C. Beaulieu, “A Simple Series for Personal Computer Computation of the Error Function $Q(.)$,” IEEE Trans. Comm., vol. 37, no. 9, pp. 989-991, Sept. 1989.
[25] C. Tellambura and A. Annamalai, “Efficient Computation of erfc(x) for Large Arguments,” IEEE Trans. Comm., vol. 48, no. 4, pp. 529-532, Apr. 2000.
[26] W. Chu and Z. Ghahramani, “Gaussian Processes for Ordinal Regression,” The J. Machine Learning Research, vol. 6, pp. 1019-1041, 2005.
[27] A.G. Gray and A.W. Moore, “Nonparametric Density Estimation: Toward Computational Tractability,” Proc. SIAM Int'l Conf. Data Mining, 2003.
[28] C. Yang, R. Duraiswami, and L. Davis, “Efficient Kernel Machines Using the Improved Fast Gauss Transform,” Advances in Neural Information Processing Systems 17, L.K. Saul, Y. Weiss, and L. Bottou, eds., pp. 1561-1568, MIT Press, 2005.
[29] V.C. Raykar and R. Duraiswami, The Improved Fast Gauss Transform with Applications to Machine Learning, Large Scale Kernel Machines, pp. 175201, MIT Press, 2007.