Issue No. 5, May 2013 (vol. 25), pp. 1186-1190
Nicholas A. Arnosti , Stanford University, Palo Alto
Jugal K. Kalita , University of Colorado, Colorado Springs
ABSTRACT
Support Vector Machines (SVMs) have been shown to achieve high performance on classification tasks across many domains, and a great deal of work has been dedicated to developing computationally efficient training algorithms for linear SVMs. One approach [1] approximately minimizes regularized risk using cutting planes, and was subsequently improved in [2], [3]. We build upon this work, presenting a modification to the algorithm developed by Franc and Sonnenburg [2]. We demonstrate empirically that our changes can reduce cutting plane training time by up to 40 percent, and discuss how properties of the data set and parameter settings affect the effectiveness of our method.
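The cutting-plane approach referenced above [1] can be illustrated with a minimal sketch. This is not the authors' modified algorithm or the OCAS method of [2]; it is a generic cutting-plane loop for the linear SVM with hinge loss, where each iteration adds a subgradient cut to a piecewise-linear lower bound on the risk and re-solves the small reduced ("master") problem. For simplicity the master QP is solved in its dual by projected gradient ascent rather than an off-the-shelf QP solver; the function name and all parameter defaults are illustrative, not from the paper.

```python
import numpy as np

def cutting_plane_svm(X, y, C=1.0, max_iters=50, eps=1e-3):
    """Sketch of cutting-plane training for a linear SVM (in the spirit of [1]).

    Minimizes 0.5*||w||^2 + C*R(w), where R(w) is the average hinge loss,
    by building a piecewise-linear lower bound on R from subgradient cuts.
    """
    n, d = X.shape
    w = np.zeros(d)
    A, B = [], []  # cut slopes a_t and offsets b_t, so that R(w) >= a_t.w + b_t
    for _ in range(max_iters):
        # Risk and one subgradient of R at the current w.
        margins = y * (X @ w)
        risk = np.mean(np.maximum(0.0, 1.0 - margins))
        viol = margins < 1.0
        a = -(y[viol, None] * X[viol]).sum(axis=0) / n
        b = risk - a @ w
        A.append(a)
        B.append(b)
        # Solve the reduced master problem through its dual:
        #   max_{alpha >= 0, sum(alpha) <= C}  -0.5*||A^T alpha||^2 + B.alpha
        # with w = -A^T alpha. Projected gradient suffices for this tiny QP.
        Am, Bm = np.array(A), np.array(B)
        G = Am @ Am.T
        alpha = np.zeros(len(B))
        step = 1.0 / (np.linalg.norm(G) + 1e-12)
        for _ in range(500):
            alpha = np.maximum(alpha + step * (Bm - G @ alpha), 0.0)
            if alpha.sum() > C:  # project onto {alpha >= 0, sum(alpha) <= C}
                u = np.sort(alpha)[::-1]
                css = np.cumsum(u) - C
                rho = np.nonzero(u - css / np.arange(1, len(u) + 1) > 0)[0][-1]
                alpha = np.maximum(alpha - css[rho] / (rho + 1), 0.0)
        w = -Am.T @ alpha
        # Stop when the cutting-plane model matches the true objective at w.
        margins = y * (X @ w)
        new_risk = np.mean(np.maximum(0.0, 1.0 - margins))
        obj = 0.5 * (w @ w) + C * new_risk
        model = 0.5 * (w @ w) + C * max(0.0, (Am @ w + Bm).max())
        if obj - model < eps * max(1.0, obj):
            break
    return w
```

The key property exploited here is that the master problem's dimension grows with the number of cuts, not the number of training examples, so each iteration touches the data only once (to compute the risk and its subgradient); the improvements in [2], [3] refine how the master solution is turned into the next iterate.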
INDEX TERMS
Training, Support vector machines, Vectors, Equations, Approximation algorithms, Convergence, Linear approximation, cutting plane SVM, Linear support vector machine
CITATION
Nicholas A. Arnosti, Jugal K. Kalita, "Cutting Plane Training for Linear Support Vector Machines", IEEE Transactions on Knowledge & Data Engineering, vol.25, no. 5, pp. 1186-1190, May 2013, doi:10.1109/TKDE.2011.247
REFERENCES
[1] T. Joachims, "Training Linear SVMs in Linear Time," Proc. ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, pp. 217-226, 2006.
[2] V. Franc and S. Sonnenburg, "Optimized Cutting Plane Algorithm for Support Vector Machines," Proc. Int'l Conf. Machine Learning (ICML), pp. 320-327, 2008.
[3] V. Franc and S. Sonnenburg, "Optimized Cutting Plane Algorithm for Large-Scale Risk Minimization," J. Machine Learning Research, vol. 10, pp. 2157-2192, 2009.
[4] E. Mayoraz and E. Alpaydin, "Support Vector Machines for Multi-Class Classification," Engineering Applications of Bio-Inspired Artificial Neural Networks, pp. 833-842, Springer, 1999.
[5] C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.
[6] C. Hsu and C. Lin, "A Comparison of Methods for Multiclass Support Vector Machines," IEEE Trans. Neural Networks, vol. 13, no. 2, pp. 415-425, Mar. 2002.
[7] T. Dietterich and G. Bakiri, "Solving Multiclass Learning Problems via Error-Correcting Output Codes," J. Artificial Intelligence Research, vol. 2, pp. 263-286, 1995.
[8] J. Weston and C. Watkins, "Support Vector Machines for Multi-Class Pattern Recognition," Proc. Seventh European Symp. Artificial Neural Networks, vol. 4, no. 6, pp. 219-224, 1999.
[9] K. Crammer and Y. Singer, "On the Algorithmic Implementation of Multiclass Kernel-Based Vector Machines," J. Machine Learning Research, vol. 2, pp. 265-292, 2002.
[10] C.H. Teo, A. Smola, S.V. Vishwanathan, and Q.V. Le, "A Scalable Modular Convex Solver for Regularized Risk Minimization," Proc. ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, pp. 727-736, 2007.
[11] S. Keerthi, S. Sundararajan, K. Chang, C. Hsieh, and C. Lin, "A Sequential Dual Method for Large Scale Multi-Class Linear SVMs," Proc. ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, pp. 408-416, 2008.
[12] S. Shalev-Shwartz and N. Srebro, "SVM Optimization: Inverse Dependence on Training Set Size," Proc. Int'l Conf. Machine Learning (ICML), pp. 928-935, 2008.
[13] T. Joachims, T. Finley, and C. Yu, "Cutting-Plane Training of Structural SVMs," Machine Learning, vol. 77, no. 1, pp. 27-59, 2009.
[14] A. Bordes, L. Bottou, and P. Gallinari, "SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent," J. Machine Learning Research, vol. 10, pp. 1737-1754, Dec. 2009.
[15] C. Lin, C. Chang, and C. Hsu, "A Practical Guide to Support Vector Classification," technical report, Nat'l Taiwan Univ., www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf, 2003.
[16] H. Lei and V. Govindaraju, "Half-against-Half Multi-Class Support Vector Machines," Proc. Int'l Workshop Multiple Classifier Systems, pp. 156-164, 2005.