Robust Linear and Support Vector Regression
September 2000 (vol. 22 no. 9)
pp. 950-955

Abstract—The robust Huber M-estimator, a differentiable cost function that is quadratic for small errors and linear otherwise, is modeled exactly, in the original primal space of the problem, by a simple, easily solvable convex quadratic program for both linear and nonlinear support vector estimators. Previous models were significantly more complex or were formulated in the dual space, and most involved specialized numerical algorithms for solving the robust Huber linear estimator [3], [6], [12], [13], [14], [23], [28]. Numerical comparisons with these algorithms indicate the computational effectiveness of the new quadratic programming model for both linear and nonlinear support vector problems. Results are shown on problems with as many as 20,000 data points, with considerably faster running times on the larger problems.
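The Huber loss described in the abstract is easy to state directly. The sketch below is illustrative only: it minimizes the mean Huber loss for a one-variable linear fit by plain gradient descent, not by the paper's convex quadratic programming formulation, and the names `fit_huber_line`, `gamma`, and `lr` are assumptions of this sketch rather than anything from the paper.

```python
def huber(r, gamma=1.0):
    """Huber loss: quadratic for |r| <= gamma, linear beyond that."""
    a = abs(r)
    return 0.5 * r * r if a <= gamma else gamma * a - 0.5 * gamma * gamma

def fit_huber_line(xs, ys, gamma=1.0, lr=0.01, steps=5000):
    """Fit y ~ w*x + b by gradient descent on the mean Huber loss.

    A toy illustration of robustness, NOT the paper's QP method.
    """
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            r = w * x + b - y
            # Huber derivative: the residual inside the quadratic zone,
            # clipped to +/- gamma outside it -- this clipping is what
            # bounds the influence of any single outlier.
            g = r if abs(r) <= gamma else (gamma if r > 0 else -gamma)
            gw += g * x / n
            gb += g / n
        w -= lr * gw
        b -= lr * gb
    return w, b

if __name__ == "__main__":
    xs = list(range(10))
    ys = [2.0 * x for x in xs]
    ys[5] = 100.0  # one gross outlier
    w, b = fit_huber_line(xs, ys)
    print(w, b)  # slope stays near 2 despite the outlier
```

Because the loss grows only linearly for large residuals, the single corrupted point barely shifts the fitted slope, whereas an ordinary least-squares fit on the same data would be pulled strongly toward it.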

[1] “Adult Dataset,” U.S. Census Bureau, publicly available
[2] V. Cherkassky and F. Mulier, Learning from Data—Concepts, Theory, and Methods. New York: John Wiley & Sons, 1998.
[3] D.I. Clark and M.R. Osborne, “Finite Algorithms for Huber's M-Estimator,” SIAM J. Scientific and Statistical Computing, vol. 7, pp. 72-85, 1986.
[4] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines. Cambridge, U.K.: Cambridge Univ. Press, 2000.
[5] “Delve Data for Evaluating Learning in Valid Experiments,”
[6] H. Ekblom, “A New Algorithm for the Huber Estimator in Linear Models,” BIT, vol. 28, pp. 123-132, 1988.
[7] D. Gale, The Theory of Linear Economic Models. New York: McGraw-Hill, 1960.
[8] P.W. Holland and R.E. Welsch, “Robust Regression Using Iteratively Reweighted Least Squares,” Comm. Statistics—Theory and Methods, vol. A6, pp. 813-827, 1977.
[9] P.J. Huber, Robust Statistics. New York: John Wiley, 1981.
[10] P.J. Huber and R. Dutter, “Numerical Solution of Robust Regression Problems,” Proc. Symp. Computational Statistics, G. Brushmann, ed., pp. 165-172, 1974.
[11] ILOG CPLEX 6.5 Reference Manual. Incline Village, Nev.: ILOG CPLEX Division, 1999.
[12] W. Li, “Numerical Estimates for the Huber M-Estimator Problem,” Approximation Theory VIII, C.K. Chui and L.L. Schumaker, eds., pp. 325-334, New York: World Scientific Publishing, 1995.
[13] W. Li and J.J. Swetits, “The Linear $\ell_1$ Estimator and the Huber M-Estimator,” SIAM J. Optimization, vol. 8, pp. 457-475, 1998.
[14] K. Madsen and H.B. Nielsen, “Finite Algorithms for Robust Linear Regression,” BIT, vol. 30, pp. 682-699, 1990.
[15] O.L. Mangasarian, Nonlinear Programming. Philadelphia: SIAM, 1994.
[16] O.L. Mangasarian, “Generalized Support Vector Machines,” Advances in Large Margin Classifiers, A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, eds., pp. 135-146, Cambridge, Mass: MIT Press, 2000.
[17] O.L. Mangasarian and R.R. Meyer, “Nonlinear Perturbation of Linear Programs,” SIAM J. Control and Optimization, vol. 17, no. 6, pp. 745-752, Nov. 1979.
[18] O.L. Mangasarian and D.R. Musicant, “Data Discrimination via Nonlinear Generalized Support Vector Machines,” Technical Report 99-03, Computer Sciences Dept., Univ. Wisconsin, Madison, Mar. 1999. To appear in: Applications and Algorithms of Complementarity, M.C. Ferris, O.L. Mangasarian, and J.-S. Pang, eds., Boston: Kluwer Academic Publishers, 2000.
[19] O.L. Mangasarian and D.R. Musicant, “Massive Support Vector Regression,” Technical Report 99-02, Data Mining Institute, Computer Sciences Dept., Univ. Wisconsin, Madison, July 1999.
[20] O.L. Mangasarian and T.-H. Shiau, “Lipschitz Continuity of Solutions of Linear Inequalities, Programs and Complementarity Problems,” SIAM J. Control and Optimization, vol. 25, no. 3, pp. 583-595, May 1987.
[21] MATLAB User's Guide. Natick, Mass.: The MathWorks, Inc., 1992.
[22] MATLAB Application Program Interface Guide. Natick, Mass.: The MathWorks, Inc., 1997.
[23] C. Michelot and M.L. Bougeard, “Duality Results and Proximal Solutions of the Huber M-Estimator Problem,” Applied Math. and Optimization, vol. 30, pp. 203-221, 1994.
[24] P.M. Murphy and D.W. Aha, “UCI Repository of Machine Learning Databases,” 1992.
[25] B.T. Polyak, Introduction to Optimization, Optimization Software, Inc., New York: Publications Division, 1987.
[26] B. Schölkopf, P. Bartlett, A. Smola, and R. Williamson, “Support Vector Regression with Automatic Accuracy Control,” Proc. Eighth Int'l Conf. Artificial Neural Networks, L. Niklasson, M. Boden, and T. Ziemke, eds., pp. 111-116, 1998.
[27] B. Schölkopf, P. Bartlett, A. Smola, and R. Williamson, “Shrinking the Tube: A New Support Vector Regression Algorithm,” technical report, GMD FIRST, Berlin, Germany, 1999.
[28] D.F. Shanno and D.M. Rocke, “Numerical Methods for Robust Regression: Linear Models,” SIAM J. Scientific and Statistical Computing, vol. 7, pp. 86-97, 1986.
[29] A. Smola, “Regression Estimation with Support Vector Learning Machines,” master's thesis, Technische Universität München, München, Germany, 1996.
[30] A. Smola, B. Schölkopf, and G. Rätsch, “Linear Programs for Automatic Accuracy Control in Regression,” technical report, GMD FIRST, Berlin, Germany, 1999.
[31] A.J. Smola, “Learning with Kernels,” PhD thesis, Technische Universität Berlin, Germany, 1998.
[32] W.N. Street and O.L. Mangasarian, “Improved Generalization via Tolerant Training,” J. Optimization Theory and Applications, vol. 96, no. 2, pp. 259-279, Feb. 1998.
[33] V.N. Vapnik, Statistical Learning Theory. New York: John Wiley & Sons, 1998.
[34] S.J. Wright, “Using Complementarity and Optimization Methods in Statistics,” Proc. Int'l Conf. Complementarity Problems, June 1999.
[35] S.J. Wright, “On Reduced Convex QP Formulations of Monotone LCP Problems,” Technical Report ANL/MCS-P808-0400, Argonne Nat'l Laboratory, Apr. 2000.

Index Terms:
Support vector machines, regression, Huber M-estimator, kernel methods.
Olvi L. Mangasarian, David R. Musicant, "Robust Linear and Support Vector Regression," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 9, pp. 950-955, Sept. 2000, doi:10.1109/34.877518