DocumentCode :
1946364
Title :
Leave-one-out Bounds for Support Vector Regression
Author :
Tian, Yingjie ; Deng, Naiyang
Author_Institution :
College of Science, China Agricultural University
Volume :
2
fYear :
2005
fDate :
28-30 Nov. 2005
Firstpage :
1061
Lastpage :
1066
Abstract :
The success of the support vector machine (SVM) depends critically on the kernel and its parameters. One of the most reasonable approaches is to select the kernel and the parameters by minimizing a bound on the leave-one-out (Loo) error. However, computing the Loo error directly is extremely time consuming. An efficient strategy is therefore to minimize an upper bound on the Loo error instead of the error itself. For support vector classification (SVC), several well-known bounds have already been proposed. This paper is concerned with support vector regression (SVR): we derive two Loo bounds for two SVR algorithms. To demonstrate their validity, preliminary experiments are also presented.
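As context for the abstract's point about cost: the exact Loo error requires retraining the model once per sample, which is why bounds are attractive. A minimal pure-Python sketch of the Loo procedure (using a trivial 1-nearest-neighbour regressor as a stand-in for SVR, purely for illustration; `fit_1nn` and `predict_1nn` are hypothetical helpers, not from the paper):

```python
def loo_error(X, y, fit, predict):
    """Leave-one-out mean absolute error: note the model is retrained n times."""
    total = 0.0
    n = len(X)
    for i in range(n):
        # Hold out sample i, train on the rest.
        X_tr = X[:i] + X[i + 1:]
        y_tr = y[:i] + y[i + 1:]
        model = fit(X_tr, y_tr)
        total += abs(predict(model, X[i]) - y[i])
    return total / n

# Stand-in regressor (1-nearest neighbour); an SVR would be used in practice.
def fit_1nn(X, y):
    return list(zip(X, y))

def predict_1nn(model, x):
    return min(model, key=lambda p: abs(p[0] - x))[1]

X = [0.0, 1.0, 2.0, 3.0]
y = [0.0, 1.0, 2.0, 3.0]
err = loo_error(X, y, fit_1nn, predict_1nn)  # n separate trainings
```

The paper's contribution is to replace this n-fold retraining with an upper bound computed from a single trained model.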
Keywords :
learning (artificial intelligence); pattern classification; regression analysis; support vector machines; leave-one-out error bound; support vector classification; support vector machine; support vector regression; Computational intelligence; Computational modeling; Educational institutions; Kernel; Machine learning; Robustness; Static VAr compensators; Support vector machine classification; Support vector machines; Upper bound;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Conference on Computational Intelligence for Modelling, Control and Automation, 2005, and International Conference on Intelligent Agents, Web Technologies and Internet Commerce
Conference_Location :
Vienna
Print_ISBN :
0-7695-2504-0
Type :
conf
DOI :
10.1109/CIMCA.2005.1631610
Filename :
1631610