DocumentCode :
3431960
Title :
SVM vs regularized least squares classification
Author :
Zhang, Peng ; Peng, Jing
Author_Institution :
Dept. of Electr. Eng. & Comput. Eng., Tulane Univ., New Orleans, LA, USA
Volume :
1
fYear :
2004
fDate :
23-26 Aug. 2004
Firstpage :
176
Abstract :
Support vector machines (SVMs) and regularized least squares (RLS) are two recent promising techniques for classification. SVMs implement the structural risk minimization principle and use the kernel trick to extend to the nonlinear case. RLS, by contrast, minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. While both have a sound mathematical foundation, RLS is strikingly simple, whereas SVMs generally admit a sparse representation of solutions. In addition, the performance of SVMs has been well documented, while comparatively little is known about RLS. This paper applies both techniques to a collection of data sets and presents results demonstrating virtually identical performance by the two methods.
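Illustrative sketch (not from the paper): the comparison the abstract describes can be reproduced in spirit with scikit-learn, treating RLS as kernel ridge regression on {-1, +1} labels thresholded at zero, against an RBF-kernel SVM. The dataset, kernel, and hyperparameters below are assumptions for illustration, not the authors' experimental setup.

    # Sketch: SVM vs. RLS classification with a shared RBF kernel.
    # Assumes scikit-learn; dataset and hyperparameters are illustrative.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.kernel_ridge import KernelRidge

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_tr)
    X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

    # SVM: hinge loss plus L2 regularization in the RKHS (sparse solution).
    svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

    # RLS: squared loss plus L2 regularization in the same RKHS; fit a
    # regressor on {-1, +1} targets and threshold its output at zero
    # (non-sparse solution, but a simple linear-system solve).
    rls = KernelRidge(kernel="rbf", alpha=1.0).fit(X_tr, 2 * y_tr - 1)
    rls_pred = (rls.predict(X_te) > 0).astype(int)

    print("SVM accuracy:", svm.score(X_te, y_te))
    print("RLS accuracy:", np.mean(rls_pred == y_te))

With comparable kernel and regularization settings, the two methods typically score within a point or two of each other on such data sets, consistent with the paper's finding of virtually identical performance.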
Keywords :
Hilbert spaces; least squares approximations; minimisation; pattern classification; support vector machines; SVM; data set collection; kernel Hilbert space; mathematical foundation; regularized least squares classification; risk minimization principle; support vector machines; virtual identical performance; Cancer; Hilbert space; Kernel; Least squares methods; Object recognition; Resonance light scattering; Risk management; Support vector machine classification; Support vector machines; Training data;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004)
ISSN :
1051-4651
Print_ISBN :
0-7695-2128-2
Type :
conf
DOI :
10.1109/ICPR.2004.1334050
Filename :
1334050