
Olvi L. Mangasarian and David R. Musicant, "Robust Linear and Support Vector Regression," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 9, pp. 950-955, Sept. 2000, doi: 10.1109/34.877518.
Abstract—The robust Huber M-estimator, a differentiable cost function that is quadratic for small errors and linear otherwise, is modeled exactly, in the original primal space of the problem, by an easily solvable simple convex quadratic program for both linear and nonlinear support vector estimators. Previous models were significantly more complex or formulated in the dual space, and most involved specialized numerical algorithms for solving the robust Huber linear estimator [3], [6], [12], [13], [14], [23], [28]. Numerical test comparisons with these algorithms indicate the computational effectiveness of the new quadratic programming model for both linear and nonlinear support vector problems. Results are shown on problems with as many as 20,000 data points, with considerably faster running times on larger problems.
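The piecewise cost function the abstract describes is the standard Huber function from robust statistics [9]: quadratic for residuals below a tuning constant (written here as gamma, a conventional choice of symbol rather than the paper's own notation), linear beyond it. A minimal sketch of that function:

```python
import numpy as np

def huber_loss(r, gamma=1.0):
    """Huber M-estimator cost: quadratic for |r| <= gamma, linear otherwise.

    gamma is the tuning constant that sets the quadratic-to-linear
    crossover; the two branches meet smoothly at |r| = gamma, where
    both equal 0.5 * gamma**2, so the function is differentiable.
    """
    r = np.asarray(r, dtype=float)
    small = np.abs(r) <= gamma
    return np.where(
        small,
        0.5 * r**2,                          # quadratic near zero
        gamma * np.abs(r) - 0.5 * gamma**2,  # linear tails, continuity-matched
    )
```

The linear tails are what give the estimator its robustness: a large outlier residual contributes only linearly to the cost, instead of quadratically as in least squares, so it cannot dominate the fit.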
[1] “Adult Dataset,” U.S. Census Bureau, publicly available from: www.sgi.com/Technology/mlcdb/.
[2] V. Cherkassky and F. Mulier, Learning from Data—Concepts, Theory, and Methods. New York: John Wiley & Sons, 1998.
[3] D.I. Clark and M.R. Osborne, “Finite Algorithms for Huber's M-Estimator,” SIAM J. Scientific and Statistical Computing, vol. 7, pp. 72-85, 1986.
[4] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines. Cambridge, U.K.: Cambridge Univ. Press, 2000.
[5] “Delve Data for Evaluating Learning in Valid Experiments,” http://www.cs.utoronto.ca/~delve/.
[6] H. Ekblom, “A New Algorithm for the Huber Estimator in Linear Models,” BIT, vol. 28, pp. 123-132, 1988.
[7] D. Gale, The Theory of Linear Economic Models. New York: McGraw-Hill, 1960.
[8] P.W. Holland and R.E. Welsch, “Robust Regression Using Iteratively Reweighted Least Squares,” Comm. Statistics—Theory and Methods, vol. A6, pp. 813-827, 1977.
[9] P.J. Huber, Robust Statistics. New York: John Wiley, 1981.
[10] P.J. Huber and R. Dutter, “Numerical Solution of Robust Regression Problems,” Proc. Symp. Computational Statistics, G. Brushmann, ed., pp. 165-172, 1974.
[11] ILOG CPLEX 6.5 Reference Manual, ILOG CPLEX Division, Incline Village, Nev., 1999.
[12] W. Li, “Numerical Estimates for the Huber M-Estimator Problem,” Approximation Theory VIII, C.K. Chui and L.L. Schumaker, eds., pp. 325-334, New York: World Scientific Publishing, 1995.
[13] W. Li and J.J. Swetits, “The Linear $\ell_1$ Estimator and the Huber M-Estimator,” SIAM J. Optimization, vol. 8, pp. 457-475, 1998.
[14] K. Madsen and H.B. Nielsen, “Finite Algorithms for Robust Linear Regression,” BIT, vol. 30, pp. 682-699, 1990.
[15] O.L. Mangasarian, Nonlinear Programming. Philadelphia: SIAM, 1994.
[16] O.L. Mangasarian, “Generalized Support Vector Machines,” Advances in Large Margin Classifiers, A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, eds., pp. 135-146, Cambridge, Mass.: MIT Press, 2000. ftp://ftp.cs.wisc.edu/math-prog/tech-reports/98-14.ps.
[17] O.L. Mangasarian and R.R. Meyer, “Nonlinear Perturbation of Linear Programs,” SIAM J. Control and Optimization, vol. 17, no. 6, pp. 745-752, Nov. 1979.
[18] O.L. Mangasarian and D.R. Musicant, “Data Discrimination via Nonlinear Generalized Support Vector Machines,” Technical Report 99-03, Computer Sciences Dept., Univ. Wisconsin, Madison, Mar. 1999. To appear in: Applications and Algorithms of Complementarity, M.C. Ferris, O.L. Mangasarian, and J.S. Pang, eds., Boston: Kluwer Academic Publishers, 2000. ftp://ftp.cs.wisc.edu/math-prog/tech-reports/99-03.ps.
[19] O.L. Mangasarian and D.R. Musicant, “Massive Support Vector Regression,” Technical Report 99-02, Data Mining Institute, Computer Sciences Dept., Univ. Wisconsin, Madison, July 1999. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-02.ps.
[20] O.L. Mangasarian and T.H. Shiau, “Lipschitz Continuity of Solutions of Linear Inequalities, Programs and Complementarity Problems,” SIAM J. Control and Optimization, vol. 25, no. 3, pp. 583-595, May 1987.
[21] MATLAB User's Guide. Natick, Mass.: The MathWorks, Inc., 1992.
[22] MATLAB Application Program Interface Guide. Natick, Mass.: The MathWorks, Inc., 1997.
[23] C. Michelot and M.L. Bougeard, “Duality Results and Proximal Solutions of the Huber M-Estimator Problem,” Applied Math. and Optimization, vol. 30, pp. 203-221, 1994.
[24] P.M. Murphy and D.W. Aha, “UCI Repository of Machine Learning Databases,” 1992. www.ics.uci.edu/~mlearn/MLRepository.html.
[25] B.T. Polyak, Introduction to Optimization, Optimization Software, Inc., New York: Publications Division, 1987.
[26] B. Schölkopf, P. Bartlett, A. Smola, and R. Williamson, “Support Vector Regression with Automatic Accuracy Control,” Proc. Eighth Int'l Conf. Artificial Neural Networks, L. Niklasson, M. Boden, and T. Ziemke, eds., pp. 111-116, 1998. http://svm.first.gmd.de.
[27] B. Schölkopf, P. Bartlett, A. Smola, and R. Williamson, “Shrinking the Tube: A New Support Vector Regression Algorithm,” technical report, GMD FIRST, Berlin, Germany, 1999. http://svm.first.gmd.de.
[28] D.F. Shanno and D.M. Rocke, “Numerical Methods for Robust Regression: Linear Models,” SIAM J. Scientific and Statistical Computing, vol. 7, pp. 86-97, 1986.
[29] A. Smola, “Regression Estimation with Support Vector Learning Machines,” master's thesis, Technische Universität München, München, Germany, 1996.
[30] A. Smola, B. Schölkopf, and G. Rätsch, “Linear Programs for Automatic Accuracy Control in Regression,” technical report, GMD FIRST, Berlin, Germany, 1999. http://svm.first.gmd.de/.
[31] A.J. Smola, “Learning with Kernels,” PhD thesis, Technische Universität Berlin, Germany, 1998.
[32] W.N. Street and O.L. Mangasarian, “Improved Generalization via Tolerant Training,” J. Optimization Theory and Applications, vol. 96, no. 2, pp. 259-279, Feb. 1998.
[33] V.N. Vapnik, Statistical Learning Theory. New York: John Wiley & Sons, 1998.
[34] S.J. Wright, “Using Complementarity and Optimization Methods in Statistics,” Proc. Int'l Conf. Complementarity Problems, June 1999.
[35] S.J. Wright, “On Reduced Convex QP Formulations of Monotone LCP Problems,” Technical Report ANL/MCS-P808-0400, Argonne Nat'l Laboratory, Apr. 2000.