DocumentCode :
1942095
Title :
Evaluation of Performance Measures for SVR Hyperparameter Selection
Author :
Smets, Koen ; Verdonk, Brigitte ; Jordaan, Elsa M.
Author_Institution :
Univ. of Antwerp, Antwerp
fYear :
2007
fDate :
12-17 Aug. 2007
Firstpage :
637
Lastpage :
642
Abstract :
To obtain accurate modeling results, it is of primary importance to find optimal values for the hyperparameters of the Support Vector Regression (SVR) model. In general, we search for the parameters that minimize an estimate of the generalization error. In this study, we empirically investigate different performance measures found in the literature: k-fold cross-validation; the computationally intensive but almost unbiased leave-one-out error and its upper bounds (the radius/margin and span bounds); Vapnik's measure, which uses an estimate of the VC dimension; and the regularized risk functional itself. For each estimate we focus on accuracy, complexity, and the presence of local minima. The latter significantly influences the applicability of gradient-based search techniques for determining the optimal parameters.
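To illustrate the general scheme the abstract describes (choosing a hyperparameter by minimizing a k-fold cross-validation estimate of the generalization error), here is a minimal, self-contained sketch. It is not the paper's method: for simplicity it uses a closed-form 1-D ridge regressor in place of SVR, and the data, grid, and function names are all hypothetical.

```python
import random

def kfold_indices(n, k):
    # Split indices 0..n-1 into k interleaved folds.
    return [list(range(i, n, k)) for i in range(k)]

def fit_ridge(xs, ys, lam):
    # Closed-form 1-D ridge through the origin (stand-in for SVR training):
    # w = sum(x*y) / (sum(x^2) + lam)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def cv_error(xs, ys, lam, k=5):
    # k-fold cross-validation estimate of the generalization (test MSE).
    fold_errs = []
    for fold in kfold_indices(len(xs), k):
        held = set(fold)
        train = [i for i in range(len(xs)) if i not in held]
        w = fit_ridge([xs[i] for i in train], [ys[i] for i in train], lam)
        fold_errs.append(sum((ys[i] - w * xs[i]) ** 2 for i in fold) / len(fold))
    return sum(fold_errs) / len(fold_errs)

# Hypothetical noisy linear data: y = 2x + noise.
random.seed(0)
xs = [i / 10 for i in range(50)]
ys = [2 * x + random.gauss(0, 0.1) for x in xs]

# Grid search: pick the hyperparameter minimizing the CV error estimate.
grid = [0.001, 0.01, 0.1, 1.0, 10.0]
best_lam = min(grid, key=lambda lam: cv_error(xs, ys, lam))
```

The same pattern applies to SVR: replace `fit_ridge` with SVR training over the grid of (C, epsilon, kernel) values, which is exactly where the cost of the estimate and the presence of local minima start to matter.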
Keywords :
estimation theory; generalisation (artificial intelligence); gradient methods; regression analysis; search problems; support vector machines; Vapnik measure; generalization error estimation; gradient-based search technique; k-fold cross-validation; leave-one-out error; performance measure; support vector regression hyperparameter selection; Genetic algorithms; Hilbert space; Kernel; Lagrangian functions; Mathematics; Neural networks; Optimization methods; Support vector machines; Upper bound; Virtual colonoscopy;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
ISSN :
1098-7576
Print_ISBN :
978-1-4244-1379-9
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2007.4371031
Filename :
4371031