Issue No. 08 - August 2010 (vol. 32)
ISSN: 0162-8828
pp: 1522-1528
Christian Igel , Ruhr-Universität Bochum, Bochum
Tobias Glasmachers , Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano
Adapting the hyperparameters of support vector machines (SVMs) is a challenging model selection problem, especially when flexible kernels are to be adapted and data are scarce. We present a coherent framework for regularized model selection of 1-norm soft margin SVMs for binary classification. We propose gradient ascent on a likelihood function of the hyperparameters. The likelihood function is based on logistic regression for robustly estimating the class-conditional probabilities and can be computed efficiently. Overfitting is an important issue in SVM model selection and can be addressed in our framework by incorporating suitable prior distributions over the hyperparameters. We show empirically that gradient-based optimization of the likelihood function is able to adapt multiple kernel parameters and leads to better models than four competing state-of-the-art methods.
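The approach described in the abstract admits a compact illustration. The following is a minimal sketch, not the authors' implementation: it assumes an RBF kernel, binary labels in {0, 1}, a held-out validation split, scikit-learn's SVC and LogisticRegression (the latter acting as a Platt-style sigmoid on the SVM outputs), a Gaussian log-prior over the log-hyperparameters, and finite-difference gradients in place of the efficient analytic gradients derived in the paper.

import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

def penalized_log_likelihood(theta, X_tr, y_tr, X_val, y_val):
    """Penalized log-likelihood of the hyperparameters theta = (log C, log gamma)."""
    C, gamma = np.exp(theta)
    svm = SVC(C=C, gamma=gamma, kernel="rbf").fit(X_tr, y_tr)
    # Platt-style logistic regression on the SVM outputs gives robust
    # estimates of the class-conditional probabilities.
    f_tr = svm.decision_function(X_tr).reshape(-1, 1)
    platt = LogisticRegression().fit(f_tr, y_tr)
    p = platt.predict_proba(svm.decision_function(X_val).reshape(-1, 1))[:, 1]
    eps = 1e-12  # guards against log(0)
    ll = np.sum(y_val * np.log(p + eps) + (1.0 - y_val) * np.log(1.0 - p + eps))
    # Gaussian prior over the log-hyperparameters (variance 10, an assumed
    # choice) regularizes model selection against overfitting.
    return ll - 0.5 * np.sum(theta ** 2) / 10.0

def gradient_ascent(theta, args, lr=0.1, steps=50, h=1e-3):
    """Maximize the likelihood by central finite-difference gradient ascent."""
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            e = np.zeros_like(theta)
            e[i] = h
            grad[i] = (penalized_log_likelihood(theta + e, *args)
                       - penalized_log_likelihood(theta - e, *args)) / (2.0 * h)
        theta = theta + lr * grad
    return theta

# Usage: theta = gradient_ascent(np.zeros(2), (X_tr, y_tr, X_val, y_val))
# returns log C and log gamma that locally maximize the penalized likelihood.

Starting from theta = np.zeros(2) corresponds to C = gamma = 1; working in log-coordinates keeps both hyperparameters positive without constrained optimization.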
Support vector machines, model selection, regularization, maximum likelihood.
Christian Igel, Tobias Glasmachers, "Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 8, pp. 1522-1528, August 2010, doi:10.1109/TPAMI.2010.95