IEEE Transactions on Knowledge and Data Engineering, Vol. 22, No. 4, April 2010
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TKDE.2009.116
Haibin Cheng , Yahoo! Labs, Santa Clara
Pang-Ning Tan , Michigan State University, East Lansing
Rong Jin , Michigan State University, East Lansing
This paper presents a framework called Localized Support Vector Machine (LSVM) for classifying data with nonlinear decision surfaces. Instead of building a sophisticated global model from the training data, LSVM constructs multiple linear SVMs, each of which is designed to accurately classify a given test example. A major limitation of this framework is its high computational cost, since a unique model must be constructed for each test example. To overcome this limitation, we propose an efficient implementation of LSVM, termed Profile SVM (PSVM). PSVM partitions the training examples into clusters and builds a separate linear SVM model for each cluster. Our empirical results show that 1) LSVM and PSVM outperform nonlinear SVM on all 20 of the evaluated data sets and 2) PSVM achieves accuracy comparable to LSVM but with significant computational savings. We also demonstrate the efficacy of the proposed approaches in classifying data with spatial and temporal dependencies.
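The core PSVM idea described above, partitioning the training set and fitting one linear SVM per partition, can be sketched with off-the-shelf components. This is an illustrative simplification, not the paper's algorithm: the clustering method (k-means), the number of clusters, the data set, and the hard nearest-cluster routing of test points are all assumptions made here for the sketch.

```python
# Sketch of a cluster-then-local-linear-SVM classifier, in the spirit of PSVM.
# All modeling choices (k-means, k=4, the two-moons data) are illustrative
# assumptions, not the method from the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# A nonlinearly separable toy problem.
X, y = make_moons(n_samples=600, noise=0.15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: partition the training examples into clusters.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_tr)

# Step 2: build a separate linear SVM for each cluster.
# If a cluster happens to contain a single class, fall back to a
# constant prediction for that cluster.
models = {}
for c in range(km.n_clusters):
    mask = km.labels_ == c
    classes = np.unique(y_tr[mask])
    if len(classes) == 1:
        models[c] = int(classes[0])
    else:
        models[c] = LinearSVC(random_state=0).fit(X_tr[mask], y_tr[mask])

# Step 3: route each test example to its nearest cluster's local model.
def predict_one(c, x):
    m = models[c]
    if isinstance(m, LinearSVC):
        return m.predict(x.reshape(1, -1))[0]
    return m  # single-class cluster: constant prediction

assign = km.predict(X_te)
y_pred = np.array([predict_one(c, x) for c, x in zip(assign, X_te)])
acc = (y_pred == y_te).mean()
print(f"local-linear accuracy: {acc:.3f}")
```

Because each local model is linear, training and prediction avoid the kernel computations of a global nonlinear SVM, which is the source of the computational savings the abstract refers to.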
Keywords: Classification, support vector machine, kernel-based learning, local learning.
H. Cheng, P.-N. Tan and R. Jin, "Efficient Algorithm for Localized Support Vector Machine," in IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 4, pp. 537-549, April 2010.