A new smoothing strategy for solving ε-support vector regression (ε-SVR), which tolerates a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques previously used for the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We term this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, shown to be globally and quadratically convergent, for solving ε-SSVR. To handle nonlinear regression with a massive data set, we introduce the reduced kernel technique to avoid the computational difficulty of dealing with a huge, fully dense kernel matrix. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm.
Index Terms: ε-insensitive loss function, ε-smooth support vector regression, kernel method, Newton-Armijo algorithm, support vector machine.
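The smoothing idea described in the abstract can be illustrated with a small sketch. A common choice in the smooth-SVM literature is to approximate the plus function (x)₊ = max(x, 0) by p(x, α) = x + (1/α)·log(1 + exp(−αx)), and to build the ε-insensitive loss from two shifted plus functions. The function names, the parameter α = 50, and the use of this particular smoothing function are illustrative assumptions; the exact formulation in the paper may differ.

```python
import numpy as np

def smooth_plus(x, alpha=50.0):
    """Smooth approximation of max(x, 0); the gap shrinks as alpha grows
    (it is bounded by log(2)/alpha).  Illustrative choice of smoother."""
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def eps_insensitive(x, eps=0.1):
    """Exact epsilon-insensitive loss: zero inside the tube |x| <= eps."""
    return np.maximum(np.abs(x) - eps, 0.0)

def smooth_eps_insensitive(x, eps=0.1, alpha=50.0):
    """Smooth surrogate, using |x|_eps = (x - eps)_+ + (-x - eps)_+."""
    return smooth_plus(x - eps, alpha) + smooth_plus(-x - eps, alpha)

# Check how closely the smooth surrogate tracks the exact loss.
residuals = np.linspace(-1.0, 1.0, 201)
err = np.max(np.abs(smooth_eps_insensitive(residuals)
                    - eps_insensitive(residuals)))
```

Because the surrogate is smooth and convex, the regression objective built from it can be minimized directly by an unconstrained Newton-type method, which is the role the Newton-Armijo algorithm plays in the paper.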

W. Hsieh, Y. Lee and C. Huang, "epsilon-SSVR: A Smooth Support Vector Machine for epsilon-Insensitive Regression," in IEEE Transactions on Knowledge & Data Engineering, vol. 17, no. , pp. 678-685, 2005.