DocumentCode :
1465934
Title :
Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters
Author :
Glasmachers, Tobias ; Igel, Christian
Author_Institution :
Dalle Molle Inst. for Artificial Intell. (IDSIA), Lugano, Switzerland
Volume :
32
Issue :
8
fYear :
2010
Firstpage :
1522
Lastpage :
1528
Abstract :
Adapting the hyperparameters of support vector machines (SVMs) is a challenging model selection problem, especially when flexible kernels are to be adapted and data are scarce. We present a coherent framework for regularized model selection of 1-norm soft margin SVMs for binary classification. We propose gradient ascent on a likelihood function of the hyperparameters. The likelihood function is based on logistic regression for robustly estimating the class conditional probabilities and can be computed efficiently. Overfitting is an important issue in SVM model selection and can be addressed in our framework by incorporating suitable prior distributions over the hyperparameters. We show empirically that gradient-based optimization of the likelihood function is able to adapt multiple kernel parameters and leads to better models than four competing state-of-the-art methods.
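The idea summarized in the abstract — tuning kernel parameters by gradient ascent on a likelihood built from probabilistic (logistic-regression-style) outputs — can be illustrated with a minimal sketch. This is not the authors' algorithm: it substitutes a simple regularized kernel logistic regression for the 1-norm soft margin SVM, uses a held-out log-likelihood as the objective, and approximates the gradient with finite differences; all function names and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Z, gamma):
    """RBF kernel matrix between rows of X and Z (gamma is the tuned hyperparameter)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr(K, y, lam=1e-2, iters=300):
    """Regularized kernel logistic regression (a stand-in for the SVM in the paper)."""
    n = len(y)
    lmax = np.linalg.eigvalsh(K).max()
    lr = 1.0 / (0.25 * lmax ** 2 / n + lam * lmax + 1e-9)  # step size from a Lipschitz bound
    alpha = np.zeros(n)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(K @ alpha)))
        grad = K @ (p - y) / n + lam * (K @ alpha)  # logistic loss + kernel-norm penalty
        alpha -= lr * grad
    return alpha

def holdout_loglik(log_gamma, Xtr, ytr, Xva, yva):
    """Mean held-out Bernoulli log-likelihood of the fitted probabilistic classifier."""
    gamma = np.exp(log_gamma)
    alpha = fit_klr(rbf_kernel(Xtr, Xtr, gamma), ytr)
    p = 1.0 / (1.0 + np.exp(-(rbf_kernel(Xva, Xtr, gamma) @ alpha)))
    eps = 1e-12
    return float(np.mean(yva * np.log(p + eps) + (1 - yva) * np.log(1 - p + eps)))

def select_log_gamma(log_gamma, Xtr, ytr, Xva, yva, steps=10, lr=0.5, h=0.1):
    """Gradient ascent on the likelihood; central finite differences stand in for
    the analytic hyperparameter gradient derived in the paper."""
    best = (holdout_loglik(log_gamma, Xtr, ytr, Xva, yva), log_gamma)
    for _ in range(steps):
        g = (holdout_loglik(log_gamma + h, Xtr, ytr, Xva, yva)
             - holdout_loglik(log_gamma - h, Xtr, ytr, Xva, yva)) / (2 * h)
        log_gamma += lr * g
        ll = holdout_loglik(log_gamma, Xtr, ytr, Xva, yva)
        if ll > best[0]:
            best = (ll, log_gamma)
    return best  # (best held-out log-likelihood, its log-gamma)

# toy binary problem: two Gaussian blobs in 2D, split into train / validation
X = np.vstack([rng.normal(-1.5, 1.0, (50, 2)), rng.normal(1.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50, dtype=float)
idx = rng.permutation(100)
Xtr, ytr, Xva, yva = X[idx[:60]], y[idx[:60]], X[idx[60:]], y[idx[60:]]

init = holdout_loglik(-4.0, Xtr, ytr, Xva, yva)  # start from a deliberately poor gamma
best_ll, best_lg = select_log_gamma(-4.0, Xtr, ytr, Xva, yva)
print(f"log-likelihood: {init:.3f} -> {best_ll:.3f} at log(gamma) = {best_lg:.2f}")
```

In the paper itself the hyperparameter gradient is computed analytically rather than by finite differences, and priors over the hyperparameters can be added to the objective to regularize the search; working on log-scale parameters, as above, is the usual convention for multiplicative kernel parameters.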
Keywords :
optimisation; pattern classification; regression analysis; support vector machines; 1-norm soft margin SVM; binary classification; gradient ascent; gradient based optimization; logistic regression; maximum likelihood model selection; Bayesian methods; Kernel; Logistics; Maximum likelihood estimation; Optimization methods; Pattern recognition; Robustness; Statistical learning; Support vector machine classification; Support vector machines; maximum likelihood; model selection; regularization;
fLanguage :
English
Journal_Title :
Pattern Analysis and Machine Intelligence, IEEE Transactions on
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.2010.95
Filename :
5444892