Pacific-Asia Workshop on Computational Intelligence and Industrial Application, IEEE (2008)
Dec. 19, 2008 to Dec. 20, 2008
ISBN: 978-0-7695-3490-9
pp: 287-291
ABSTRACT
When traditional methods are used to train an approximately linear support vector machine (SVM), the resulting kernel matrix occupies a large amount of computer memory and leads to slow convergence. To improve the convergence speed of SVM training, a method for training approximately linear support vector machines based on variational inequality (VIALSVM) is proposed. The method transforms the convex quadratic programming problem that arises in training into a variational inequality problem, and finds the optimal separating hyperplane by solving this variational inequality, without producing the large volume of intermediate data that would otherwise occupy computer memory. The method therefore greatly improves both the training and test speed of SVM classification. VIALSVM is applied to the multidimensional iris training samples. Experiments show that VIALSVM achieves a lower misclassification rate and, at an equal misclassification rate, converges faster than the traditional SVM, especially on high-dimensional training samples.
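The abstract describes recasting the SVM's convex quadratic program as a variational inequality and solving that instead. As a minimal illustrative sketch (the paper's exact VIALSVM update rule is not given here), the box-constrained dual of a soft-margin linear SVM can be written as the VI "find a in [0, C]^n with <F(a), b - a> >= 0 for all b in [0, C]^n, where F(a) = Qa - 1", which a simple projection iteration solves; the bias is absorbed by appending a constant feature so the equality constraint drops out:

```python
import numpy as np

def train_svm_vi(X, y, C=1.0, tau=None, iters=2000):
    """Train a soft-margin linear SVM by solving its dual as a
    box-constrained variational inequality with projected-gradient
    (fixed-point) iteration. Illustrative sketch only; function name,
    step-size choice, and iteration count are assumptions, not the
    paper's algorithm."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])   # absorb bias as extra feature
    Yx = y[:, None] * Xa
    Q = Yx @ Yx.T                                   # Q_ij = y_i y_j <x_i, x_j>
    if tau is None:
        tau = 1.0 / np.linalg.norm(Q, 2)            # step <= 1/L for stability
    a = np.zeros(len(y))
    for _ in range(iters):
        grad = Q @ a - 1.0                          # F(a) = Qa - 1
        a = np.clip(a - tau * grad, 0.0, C)         # project onto the box [0, C]^n
    w = (a * y) @ Xa                                # recover primal weights
    return w[:-1], w[-1]                            # (weights, bias)

# Toy, approximately linearly separable data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20, dtype=float)
w, b = train_svm_vi(X, y)
pred = np.sign(X @ w + b)
```

Note that the iteration stores only the dual vector and gradient, which reflects the abstract's memory argument: no large intermediate factorizations of the kernel matrix are materialized beyond Q itself (and for a linear SVM, Q can also be applied implicitly through X).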
INDEX TERMS
support vector machine, variational inequality, approximately linear, separating hyperplane
CITATION
Haiyan Xie, Depeng Zhao, Fengying Miao, "A Kind of Approximately Linear Support Vector Machine Based on Variational Inequality", Pacific-Asia Workshop on Computational Intelligence and Industrial Application, IEEE, vol. 01, pp. 287-291, 2008, doi:10.1109/PACIIA.2008.294