Title :
Leave-one-out Bounds for Support Vector Regression
Author :
Tian, Yingjie ; Deng, Naiyang
Author_Institution :
Coll. of Sci., China Agric. Univ.
Abstract :
The success of support vector machines (SVMs) depends critically on the kernel and its parameters. One of the most reasonable approaches is to select the kernel and the parameters by minimizing a bound on the leave-one-out (Loo) error. However, computing the Loo error directly is extremely time consuming, so an efficient strategy is to minimize an upper bound on the Loo error instead of the error itself. For support vector classification (SVC), several well-known bounds have been proposed. This paper is concerned with support vector regression (SVR): we derive two Loo bounds for two SVR algorithms. To demonstrate their validity, preliminary experiments are also presented.
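The cost the abstract alludes to comes from the definition of the Loo error: the model must be retrained once per training point. A minimal stdlib-only sketch of exact Loo computation (using a trivial constant-mean regressor as a hypothetical stand-in where an SVR fit would normally go; the paper's bounds avoid exactly this n-fold refitting):

```python
def loo_error(xs, ys, fit, loss):
    """Exact leave-one-out error: refit on n-1 points, test on the
    held-out one. The n full training runs are what make this costly."""
    n = len(xs)
    total = 0.0
    for i in range(n):
        train_x = xs[:i] + xs[i + 1:]
        train_y = ys[:i] + ys[i + 1:]
        predict = fit(train_x, train_y)  # one full training run per point
        total += loss(predict(xs[i]), ys[i])
    return total / n

# Stand-in "regressor": predicts the training-set mean.
# In the paper's setting this would be an SVR training procedure.
def mean_fit(xs, ys):
    m = sum(ys) / len(ys)
    return lambda x: m

abs_loss = lambda y_hat, y: abs(y_hat - y)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.0, 2.0, 3.0, 4.0]
print(loo_error(xs, ys, mean_fit, abs_loss))  # 4/3 for this toy data
```

Minimizing an upper bound replaces the loop of n training runs with quantities obtainable from a single fit, which is the efficiency gain the paper pursues for SVR.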
Keywords :
learning (artificial intelligence); pattern classification; regression analysis; support vector machines; leave-one-out error bound; support vector classification; support vector machine; support vector regression; Computational intelligence; Computational modeling; Educational institutions; Kernel; Machine learning; Robustness; Static VAr compensators; Support vector machine classification; Support vector machines; Upper bound;
Conference_Titel :
International Conference on Computational Intelligence for Modelling, Control and Automation, 2005, and International Conference on Intelligent Agents, Web Technologies and Internet Commerce
Conference_Location :
Vienna
Print_ISBN :
0-7695-2504-0
DOI :
10.1109/CIMCA.2005.1631610