DocumentCode
786809
Title
Training Hard-Margin Support Vector Machines Using Greedy Stagewise Algorithm
Author
Bo, Liefeng ; Wang, Ling ; Jiao, Licheng
Author_Institution
Key Lab. of Intell. Perception, Xidian Univ., Xi'an
Volume
19
Issue
8
fYear
2008
Firstpage
1446
Lastpage
1455
Abstract
Hard-margin support vector machines (HM-SVMs) are prone to overfitting in the presence of noise. Soft-margin SVMs address this problem by introducing a regularization term and achieve state-of-the-art performance. However, this approach incurs a relatively high computational cost. In this paper, an alternative method, a greedy stagewise algorithm for SVMs named GS-SVMs, is presented to cope with the overfitting of HM-SVMs without employing a regularization term. The most attractive property of GS-SVMs is that its worst-case computational complexity scales only quadratically with the number of training samples. Experiments on large data sets with up to 400 000 training samples demonstrate that GS-SVMs can be faster than LIBSVM 2.83 without sacrificing accuracy. Finally, we employ statistical learning theory to analyze the empirical results, showing that the success of GS-SVMs lies in the fact that its early stopping rule acts as an implicit regularization term.
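The abstract's central idea, adding kernel basis functions one stage at a time and relying on early stopping rather than an explicit regularization term, can be illustrated with a small sketch. The record does not include the paper's actual update rules, so the stage selection and coefficient step below follow a generic matching-pursuit-style stagewise fit under squared loss; the kernel choice, the `patience` stopping rule, and all function names are illustrative assumptions, not the authors' GS-SVMs algorithm.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise RBF kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_stagewise_fit(X, y, X_val, y_val, gamma=1.0, max_stages=50, patience=5):
    """Greedily add one kernel basis function per stage; stop early when
    validation accuracy stalls (early stopping as implicit regularization)."""
    K = rbf_kernel(X, X, gamma)          # basis evaluations on the training set
    K_val = rbf_kernel(X_val, X, gamma)  # and on the validation set
    f = np.zeros(len(X))                 # current decision values (train)
    f_val = np.zeros(len(X_val))         # current decision values (validation)
    chosen, coefs = [], []
    best_acc, stale = 0.0, 0
    for _ in range(max_stages):
        residual = y - f
        # Pick the basis (training sample) most correlated with the residual.
        scores = K.T @ residual
        j = int(np.argmax(np.abs(scores)))
        # Optimal least-squares coefficient for that single basis function.
        alpha = scores[j] / (K[:, j] @ K[:, j])
        f += alpha * K[:, j]
        f_val += alpha * K_val[:, j]
        chosen.append(j)
        coefs.append(alpha)
        acc = np.mean(np.sign(f_val) == y_val)
        if acc > best_acc:
            best_acc, stale = acc, 0
        else:
            stale += 1
            if stale >= patience:        # stop before the fit turns into overfit
                break
    return chosen, coefs, best_acc
```

Because each stage touches one column of an n-by-n kernel matrix, the per-stage cost is linear in the number of training samples, which is consistent with the quadratic worst-case scaling the abstract claims for a bounded number of stages.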
Keywords
computational complexity; greedy algorithms; greedy stagewise algorithm; hard-margin support vector machine training; implicit regularization term; soft-margin SVM; support vector machines (SVMs); Classification; Vapnik–Chervonenkis (VC) dimension; Algorithms; Artificial Intelligence; Computer Simulation; Models, Theoretical; Neural Networks (Computer); Pattern Recognition, Automated
fLanguage
English
Journal_Title
Neural Networks, IEEE Transactions on
Publisher
ieee
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2008.2000576
Filename
4560240
Link To Document