DocumentCode :
2223694
Title :
A geometric approach to train support vector machines
Author :
Yang, Ming-Hsuan ; Ahuja, Narendra
Author_Institution :
Dept. of Comput. Sci., Illinois Univ., Urbana, IL, USA
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
430
Abstract :
Support Vector Machines (SVMs) have shown great potential in numerous visual learning and pattern recognition problems. The optimal decision surface of an SVM is constructed from its support vectors, which are conventionally determined by solving a quadratic programming (QP) problem. However, solving such a large optimization problem is challenging because it is computationally intensive and the memory requirement grows with the square of the number of training vectors. In this paper, we propose a geometric method to extract a small superset of the support vectors, which we call guard vectors, and use it to construct the optimal decision surface. Specifically, the guard vectors are found by solving a set of linear programming problems. Experimental results on synthetic and real data sets show that the proposed method is more efficient than conventional QP-based methods and requires much less memory.
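The abstract does not spell out the geometric construction, so the following is only a minimal illustrative sketch of the general idea of pruning the training set before the QP step, not the paper's guard-vector algorithm. It assumes a linearly separable, hard-margin setting, where a point lying strictly inside the convex hull of the other points of its class cannot be a support vector; membership in the convex hull is tested with a small linear program (scipy.optimize.linprog), and the surviving candidate superset is passed to a standard QP-based solver (sklearn.svm.SVC). All function names and parameters here are illustrative choices.

```python
# Hedged sketch (not the paper's exact method): use LP feasibility tests to
# discard training points that lie in the convex hull of the other points of
# their own class, then train a conventional QP-based SVM on the remainder.
import numpy as np
from scipy.optimize import linprog
from sklearn.svm import SVC

def in_convex_hull(x, points):
    """LP feasibility test: is x a convex combination of the given points?"""
    m = len(points)
    if m == 0:
        return False
    # Variables: lambda_1..lambda_m >= 0 with sum(lambda) = 1 and P^T lambda = x.
    A_eq = np.vstack([points.T, np.ones((1, m))])
    b_eq = np.concatenate([x, [1.0]])
    res = linprog(c=np.zeros(m), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.success

def candidate_superset(X, y):
    """Keep only points NOT inside the hull of the other same-class points."""
    keep = []
    for i in range(len(X)):
        same = (y == y[i])
        same[i] = False                      # exclude the point itself
        if not in_convex_hull(X[i], X[same]):
            keep.append(i)
    return np.array(keep)

# Toy usage: solve the (much smaller) QP only on the reduced candidate set.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (60, 2)), rng.normal(2, 1, (60, 2))])
y = np.array([0] * 60 + [1] * 60)
idx = candidate_superset(X, y)
clf = SVC(kernel="linear", C=1e3).fit(X[idx], y[idx])
print(f"kept {len(idx)} of {len(X)} points; support vectors: {len(clf.support_)}")
```

Each hull test is an independent LP, so this kind of filtering trades one large QP for many small LPs, which is the same general trade-off the abstract describes.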
Keywords :
computer vision; linear programming; optimisation; pattern recognition; quadratic programming; geometric approach; optimal decision surface; optimization problem; support vector machines; visual learning
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computer Vision and Pattern Recognition, 2000. Proceedings. IEEE Conference on
Conference_Location :
Hilton Head Island, SC
ISSN :
1063-6919
Print_ISBN :
0-7695-0662-3
Type :
conf
DOI :
10.1109/CVPR.2000.855851
Filename :
855851