DocumentCode :
2709169
Title :
Subspace based least squares support vector machines for pattern classification
Author :
Kitamura, Takuya ; Abe, Shigeo ; Fukui, Kazuhiro
Author_Institution :
Grad. Sch. of Eng., Kobe Univ., Kobe, Japan
fYear :
2009
fDate :
14-19 June 2009
Firstpage :
1640
Lastpage :
1646
Abstract :
In this paper, we discuss subspace-based least squares support vector machines (SSLS-SVMs), in which an input vector is classified into the class with the maximum similarity. Namely, we define the similarity measure for each class as a weighted sum of vectors called dictionaries and optimize the weights so that the margin between classes is maximized. Because the similarity measure is defined separately for each class, the similarity measure associated with a data sample needs to be the largest among all the similarity measures. Introducing slack variables, we express these constraints as equality constraints. The resulting SSLS-SVMs are then similar to LS-SVMs in an all-at-once formulation. Because the all-at-once formulation is inefficient, we also propose SSLS-SVMs in a one-against-all formulation. We demonstrate the effectiveness of the proposed methods by comparing them with the conventional method on two-class problems.
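The classification rule described above (assign an input to the class whose weighted-sum similarity is largest) can be sketched as follows. This is a minimal illustration, not the paper's method: the kernel choice, the dictionary vectors, and the weights here are placeholders, whereas in the paper the weights are obtained by the LS-SVM-style margin optimization.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel; an assumed choice for illustration
    return np.exp(-gamma * np.sum((x - y) ** 2))

def similarity(x, dictionary, weights, gamma=1.0):
    # similarity of x to one class: weighted sum of kernel
    # evaluations against that class's dictionary vectors
    return sum(w * rbf_kernel(x, d, gamma)
               for w, d in zip(weights, dictionary))

def classify(x, dictionaries, weight_sets):
    # assign x to the class with maximum similarity
    sims = [similarity(x, D, w)
            for D, w in zip(dictionaries, weight_sets)]
    return int(np.argmax(sims))

# toy two-class example: one dictionary vector per class,
# uniform (untrained) weights
D = [[np.array([0.0, 0.0])], [np.array([3.0, 3.0])]]
W = [[1.0], [1.0]]
label = classify(np.array([0.2, -0.1]), D, W)
print(label)  # input lies near class 0's dictionary vector
```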
Keywords :
least squares approximations; pattern classification; support vector machines; data sample; dictionary; equality constraint; pattern classification; similarity measure; subspace based least square support vector machine; Dictionaries; Eigenvalues and eigenfunctions; Kernel; Least squares methods; Neural networks; Pattern classification; Principal component analysis; Support vector machine classification; Support vector machines; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Conference_Location :
Atlanta, GA
ISSN :
1098-7576
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2009.5178763
Filename :
5178763