Bibliographic References
Yuh-Jye Lee, Wen-Feng Hsieh, and Chien-Ming Huang, “ε-SSVR: A Smooth Support Vector Machine for ε-Insensitive Regression,” IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 5, pp. 678-685, May 2005, doi: 10.1109/TKDE.2005.77.
Index Terms: ε-insensitive loss function, ε-smooth support vector regression, kernel method, Newton-Armijo algorithm, support vector machine.
[1] D.P. Bertsekas, Nonlinear Programming. Belmont, Mass.: Athena Scientific, 1995.
[2] C.J.C. Burges, “A Tutorial on Support Vector Machines for Pattern Recognition,” Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.
[3] C.C. Chang and C.J. Lin, LIBSVM: A Library for Support Vector Machines, 2001, software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[4] B. Chen and P.T. Harker, “Smooth Approximations to Nonlinear Complementarity Problems,” SIAM J. Optimization, vol. 7, pp. 403-420, 1997.
[5] C. Chen and O.L. Mangasarian, “Smoothing Methods for Convex Inequalities and Linear Complementarity Problems,” Math. Programming, vol. 71, no. 1, pp. 51-69, 1995.
[6] C. Chen and O.L. Mangasarian, “A Class of Smoothing Functions for Nonlinear and Mixed Complementarity Problems,” Computational Optimization and Applications, vol. 5, no. 2, pp. 97-138, 1996.
[7] X. Chen, L. Qi, and D. Sun, “Global and Superlinear Convergence of the Smoothing Newton Method and Its Application to General Box Constrained Variational Inequalities,” Math. of Computation, vol. 67, pp. 519-540, 1998.
[8] X. Chen and Y. Ye, “On Homotopy-Smoothing Methods for Variational Inequalities,” SIAM J. Control and Optimization, vol. 37, pp. 589-616, 1999.
[9] P.W. Christensen and J.S. Pang, “Frictional Contact Algorithms Based on Semismooth Newton Methods,” Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, M. Fukushima and L. Qi, eds., pp. 81-116, Kluwer Academic Publishers, 1999.
[10] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines. Cambridge: Cambridge Univ. Press, 2000.
[11] DELVE, Data for Evaluating Learning in Valid Experiments, Comp-Activ Dataset, http://www.cs.toronto.edu/~delve/data/comp-activ/desc.html, 2005.
[12] DELVE, Data for Evaluating Learning in Valid Experiments, Kin-Family Dataset, http://www.cs.toronto.edu/~delve/data/kin/desc.html, 2005.
[13] J.E. Dennis and R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Englewood Cliffs, N.J.: Prentice-Hall, 1983.
[14] H. Drucker, C.J.C. Burges, L. Kaufman, A. Smola, and V. Vapnik, “Support Vector Regression Machines,” Advances in Neural Information Processing Systems 9, M.C. Mozer, M.I. Jordan, and T. Petsche, eds., pp. 155-161, Cambridge, Mass.: MIT Press, 1997.
[15] M. Fukushima and L. Qi, Reformulation: Nonsmooth, Piecewise Smooth, Semismooth, and Smoothing Methods. Dordrecht, The Netherlands: Kluwer Academic Publishers, 1999.
[16] G. Fung and O.L. Mangasarian, “Proximal Support Vector Machine Classifiers,” Proc. KDD-2001: Knowledge Discovery and Data Mining, F. Provost and R. Srikant, eds., pp. 77-86, 2001, ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/01-02.ps.
[17] S.Y. Huang and Y.J. Lee, “Reduced Support Vector Machines: A Statistical Theory,” Preprint, Inst. of Statistical Science, Academia Sinica, 2004, http://www.stat.sinica.edu.tw/~syhuang/.
[18] T. Joachims, ${\rm{SVM}}^{light}$, 2002, http://svmlight.joachims.org.
[19] Y.J. Lee and O.L. Mangasarian, “RSVM: Reduced Support Vector Machines,” Technical Report 00-07, Data Mining Inst., Computer Sciences Dept., Univ. of Wisconsin, Madison, Wisconsin, July 2000, also Proc. First SIAM Int'l Conf. Data Mining, 2001, ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/00-07.ps.
[20] Y.J. Lee and O.L. Mangasarian, “SSVM: A Smooth Support Vector Machine,” Computational Optimization and Applications, vol. 20, pp. 5-22, 2001, also Data Mining Inst., Univ. of Wisconsin, Technical Report 99-03, ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-03.ps.
[21] K.M. Lin and C.J. Lin, “A Study on Reduced Support Vector Machines,” IEEE Trans. Neural Networks, vol. 14, no. 6, pp. 1449-1459, 2003.
[22] O.L. Mangasarian, “Generalized Support Vector Machines,” Advances in Large Margin Classifiers, A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, eds., pp. 135-146, Cambridge, Mass.: MIT Press, 2000, ftp://ftp.cs.wisc.edu/math-prog/tech-reports/98-14.ps.
[23] O.L. Mangasarian and D.R. Musicant, “Robust Linear and Support Vector Regression,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 9, pp. 950-955, 2000.
[24] O.L. Mangasarian and D.R. Musicant, “Large Scale Kernel Regression via Linear Programming,” Machine Learning, vol. 46, pp. 255-269, 2002, ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-09.ps, ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-02.ps.
[25] MATLAB, User's Guide. Natick, Mass.: The MathWorks, Inc., 1994-2001, http://www.mathworks.com.
[26] C.L. Blake and C.J. Merz, UCI Repository of Machine Learning Databases, 1998, http://www.ics.uci.edu/~mlearn/MLRepository.html.
[27] D.R. Musicant and A. Feinberg, “Active Set Support Vector Regression,” IEEE Trans. Neural Networks, vol. 15, no. 2, pp. 268-275, 2004.
[28] M. Stone, “Cross-Validatory Choice and Assessment of Statistical Predictions,” J. Royal Statistical Soc., vol. 36, pp. 111-147, 1974.
[29] P. Tseng, “Analysis of a Non-Interior Continuation Method Based on Chen-Mangasarian Smoothing Functions for Complementarity Problems,” Reformulation: Nonsmooth, Piecewise Smooth, Semismooth, and Smoothing Methods, M. Fukushima and L. Qi, eds., pp. 381-404, Dordrecht, Netherlands: Kluwer Academic Publishers, 1999.
[30] V.N. Vapnik, The Nature of Statistical Learning Theory. New York: Springer, 1995.
[31] R.C. Whaley, A. Petitet, and J.J. Dongarra, “Automated Empirical Optimization of Software and the ATLAS Project,” Parallel Computing, vol. 27, nos. 1-2, pp. 3-35, 2001, also available as Univ. of Tennessee LAPACK Working Note #147, UT-CS-00-448, www.netlib.org/lapack/lawns/lawn147.ps, 2000.
[32] I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. San Francisco: Morgan Kaufmann, 1999.