DocumentCode :
1680828
Title :
Benders decomposition technique for support vector regression
Author :
Trafalis, Theodore B. ; Ince, Huseyin
Author_Institution :
Sch. of Ind. Eng., Oklahoma Univ., Norman, OK, USA
Volume :
3
fYear :
2002
fDate :
2002
Firstpage :
2767
Lastpage :
2772
Abstract :
The theory of the support vector machine (SVM) algorithm is based on statistical learning theory. Training an SVM leads to either a quadratic programming (QP) problem or a linear programming (LP) problem, depending on the specific norm used to compute the distance between the convex hulls of the two classes. The l1-norm distance leads to a large-scale linear programming problem when the sample size is very large. We propose to apply the Benders decomposition technique to the resulting LP for the regression case. Preliminary results show that this technique is much faster than the QP formulation.
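To illustrate the LP formulation the abstract refers to, the sketch below writes linear epsilon-insensitive support vector regression with an l1-norm penalty on the weights as a single linear program. This is a minimal assumption-laden illustration, not the authors' implementation: the solver (`scipy.optimize.linprog`), the variable splitting, and the parameter values are all choices made here for clarity, and the paper's actual contribution (decomposing this LP with Benders' technique) is not shown.

```python
# Sketch: l1-norm epsilon-insensitive SVR posed as a linear program.
# Assumption: we use scipy's generic LP solver; the paper instead applies
# Benders decomposition to exploit the problem's large-scale structure.
import numpy as np
from scipy.optimize import linprog

def lp_svr(X, y, C=10.0, eps=0.1):
    """Minimize ||w||_1 + C * sum(slacks) s.t. |y_i - (w.x_i + b)| <= eps + slack_i.

    Free variables are split into nonnegative parts so every LP variable
    has the default bound (0, +inf):
      z = [w+ (d), w- (d), b+ (1), b- (1), xi (n), xi* (n)]
    """
    n, d = X.shape
    ones_n = np.ones((n, 1))
    # Objective: ||w||_1 = sum(w+ + w-); b is unpenalized; slacks cost C each.
    c = np.concatenate([np.ones(2 * d), np.zeros(2), C * np.ones(2 * n)])
    # Upper tube:  (X w + b) - y <= eps + xi*
    A_up = np.hstack([X, -X, ones_n, -ones_n, np.zeros((n, n)), -np.eye(n)])
    b_up = eps + y
    # Lower tube:  y - (X w + b) <= eps + xi
    A_lo = np.hstack([-X, X, -ones_n, ones_n, -np.eye(n), np.zeros((n, n))])
    b_lo = eps - y
    res = linprog(c, A_ub=np.vstack([A_up, A_lo]),
                  b_ub=np.concatenate([b_up, b_lo]))
    z = res.x
    w = z[:d] - z[d:2 * d]          # recover signed weights
    b = z[2 * d] - z[2 * d + 1]     # recover signed intercept
    return w, b
```

On noise-free data y = 2x + 1 over x in [0, 1] with eps = 0.1, minimizing the l1 weight norm shrinks the slope to the flattest line that still fits inside the epsilon-tube, giving w = 1.8 and b = 1.1 rather than the generating coefficients. The number of tube constraints grows linearly with the sample size, which is exactly the large-scale structure that motivates the decomposition approach.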
Keywords :
learning (artificial intelligence); learning automata; linear programming; neural nets; quadratic programming; statistical analysis; Benders decomposition; convex hulls; linear programming; machine learning; quadratic programming; regression; statistical learning theory; support vector machines; Industrial engineering; Industrial training; Large-scale systems; Linear programming; Machine learning; Quadratic programming; Statistical learning; Support vector machine classification; Support vector machines; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
ISSN :
1098-7576
Print_ISBN :
0-7803-7278-6
Type :
conf
DOI :
10.1109/IJCNN.2002.1007586
Filename :
1007586