Title :
Faster optimization of SVR hyperparameters based on minimizing cross-validation error
Author :
Kobayashi, K. ; Nakano, R.
Author_Institution :
Nagoya Inst. of Technol.
Abstract :
The performance of support vector (SV) regression depends strongly on its hyperparameters, such as the width of the insensitive zone, the penalty factor, and kernel function parameters. A method called MCV-SVR was recently proposed, which optimizes the SVR hyperparameters λ so as to minimize a cross-validation error. This paper proposes a faster version of MCV-SVR. The MCV-SVR method iterates two basic steps until convergence: step 1 optimizes the parameters θ under given λ, while step 2 improves λ under given θ. The present paper accelerates step 2 by effectively reducing the number of samples used for evaluation. Our experiments on two data sets show that the CPU time for step 2 was reduced by more than one order of magnitude, and the total CPU time was reduced by half or more, while the generalization performance remained comparable.
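The two-step structure described in the abstract (step 1: fit model parameters θ for fixed hyperparameters λ; step 2: choose λ to reduce cross-validation error) can be sketched as follows. This is an illustrative sketch only: kernel ridge regression stands in for the SVR quadratic program to keep it dependency-light, and a simple grid search stands in for the paper's gradient-based λ update; all function names and grid values are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian (RBF) kernel matrix between row-sample matrices A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def cv_error(X, y, lam, gamma, k=5):
    # Step 2's objective: k-fold cross-validation squared error for fixed
    # hyperparameters (lam, gamma). Kernel ridge regression is used here as
    # a stand-in for solving the SVR dual problem.
    n = len(y)
    folds = np.array_split(np.arange(n), k)
    err = 0.0
    for f in folds:
        tr = np.setdiff1d(np.arange(n), f)
        K = rbf_kernel(X[tr], X[tr], gamma)
        # Step 1: solve for the model parameters under the given hyperparameters.
        alpha = np.linalg.solve(K + lam * np.eye(len(tr)), y[tr])
        pred = rbf_kernel(X[f], X[tr], gamma) @ alpha
        err += ((pred - y[f]) ** 2).sum()
    return err / n

def tune(X, y):
    # Outer loop: pick the hyperparameter pair minimizing CV error.
    # (MCV-SVR instead improves lambda iteratively; a grid is used
    # here purely for illustration.)
    grid = [(lam, g) for lam in (1e-3, 1e-2, 1e-1, 1.0)
                     for g in (0.1, 1.0, 10.0)]
    return min(grid, key=lambda lg: cv_error(X, y, *lg))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (60, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=60)
best = tune(X, y)
```

The paper's speed-up targets the inner `cv_error` evaluation: by evaluating step 2 on a reduced subset of samples, the cost of each hyperparameter update shrinks while the selected λ remains close to the full-data optimum.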
Keywords :
minimisation; regression analysis; support vector machines; MCV-SVR; SVR hyperparameter optimization; cross-validation error minimization; support vector regression; Acceleration; Convergence; Kernel; Lagrangian functions; Neural networks; Optimization methods; Quadratic programming; Training data; Vectors;
Conference_Titel :
Cybernetics and Intelligent Systems, 2004 IEEE Conference on
Conference_Location :
Singapore
Print_ISBN :
0-7803-8643-4
DOI :
10.1109/ICCIS.2004.1460729