IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 8, August 2010
pp. 1522-1528
Tobias Glasmachers, Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano
Christian Igel, Ruhr-Universität Bochum, Bochum
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2010.95
ABSTRACT
Adapting the hyperparameters of support vector machines (SVMs) is a challenging model selection problem, especially when flexible kernels are to be adapted and data are scarce. We present a coherent framework for regularized model selection of 1-norm soft margin SVMs for binary classification. We propose gradient ascent on a likelihood function of the hyperparameters. The likelihood function is based on logistic regression for robustly estimating the class conditional probabilities and can be computed efficiently. Overfitting is an important issue in SVM model selection and can be addressed in our framework by incorporating suitable prior distributions over the hyperparameters. We show empirically that gradient-based optimization of the likelihood function is able to adapt multiple kernel parameters and leads to better models than four concurrent state-of-the-art methods.
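To illustrate the kind of probability model such a likelihood rests on: class conditional probabilities can be estimated Platt-style by fitting a two-parameter sigmoid to the SVM's real-valued outputs. The sketch below is a minimal NumPy illustration under that assumption — the function names, the plain gradient-descent fit, and the synthetic data are illustrative choices, not the authors' implementation — fitting p(y=+1 | f) = 1 / (1 + exp(A f + B)) by maximizing the log-likelihood; in the paper's framework, an objective of this form is then optimized with respect to the SVM hyperparameters.

```python
import numpy as np

def platt_scale(f, y, lr=1e-3, steps=2000):
    """Fit p(y=+1 | f) = 1 / (1 + exp(A*f + B)) to decision values f
    and labels y in {-1, +1} by gradient descent on the negative
    log-likelihood (logistic regression on the classifier's outputs)."""
    t = (y + 1) / 2.0                  # map labels {-1,+1} -> targets {0,1}
    A, B = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(A * f + B))
        A -= lr * np.sum((t - p) * f)  # d(NLL)/dA
        B -= lr * np.sum(t - p)        # d(NLL)/dB
    return A, B

def log_likelihood(f, y, A, B):
    """Log-likelihood of labels y under the fitted sigmoid model,
    computed with logaddexp for numerical stability."""
    t = (y + 1) / 2.0
    z = A * f + B
    lse = np.logaddexp(0.0, z)         # log(1 + exp(z))
    # log p = -lse, log(1 - p) = z - lse
    return np.sum(t * (-lse) + (1 - t) * (z - lse))

# Synthetic demo: decision values correlated with the labels.
rng = np.random.default_rng(0)
y = rng.choice(np.array([-1.0, 1.0]), size=200)
f = y + rng.normal(0.0, 0.5, size=200)
A, B = platt_scale(f, y)
ll = log_likelihood(f, y, A, B)
```

A fitted A is negative here (larger decision values mean higher probability of the positive class), and the resulting log-likelihood exceeds that of the uninformative p = 0.5 model — it is this likelihood, evaluated as a function of the SVM hyperparameters, that the paper ascends by gradient steps.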
INDEX TERMS
Support vector machines, model selection, regularization, maximum likelihood.
CITATION
Tobias Glasmachers, Christian Igel, "Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 8, pp. 1522-1528, August 2010, doi:10.1109/TPAMI.2010.95