DocumentCode :
1166328
Title :
Empirical risk minimization for support vector classifiers
Author :
Pérez-Cruz, Fernando ; Navia-Vázquez, Angel ; Figueiras-Vidal, Aníbal R. ; Artés-Rodríguez, Antonio
Author_Institution :
Dept. of Signal Theor. & Commun., Univ. Carlos III de Madrid, Spain
Volume :
14
Issue :
2
fYear :
2003
fDate :
3/1/2003
Firstpage :
296
Lastpage :
303
Abstract :
In this paper, we propose a general technique for solving support vector classifiers (SVCs) with an arbitrary loss function, relying on the application of an iterative reweighted least squares (IRWLS) procedure. We further show that three properties of the SVC solution can be written as conditions over the loss function. This technique allows the empirical risk minimization (ERM) inductive principle to be implemented on large-margin classifiers while, at the same time, yielding very compact solutions (in terms of the number of support vectors). The improvements obtained by changing the SVC loss function are illustrated with synthetic and real data examples.
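To make the abstract's central idea concrete, the following is a minimal sketch of training a linear SVC with the standard hinge loss via an IRWLS iteration. It is an illustration of the general IRWLS-for-SVC approach the abstract describes, not the paper's exact algorithm: the function name, the weighting rule a_i = C/e_i on positive slacks, and the eps cap are assumptions of this sketch.

```python
import numpy as np

def irwls_svc(X, y, C=10.0, n_iter=50, eps=1e-6):
    """Linear SVC (hinge loss) trained by iterative reweighted least
    squares.  Each iteration solves a weighted, regularized least-squares
    problem whose weights are refreshed from the current margin slacks."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # append a bias column
    beta = np.zeros(d + 1)                 # stacked parameters [w; b]
    R = np.eye(d + 1)
    R[d, d] = 0.0                          # do not regularize the bias
    for _ in range(n_iter):
        e = 1.0 - y * (Xb @ beta)          # margin slacks: e_i > 0 means margin violation
        # Hinge-loss IRWLS weights: a_i = C / e_i for violated margins,
        # 0 otherwise; eps caps the weight to avoid division blow-up.
        a = np.where(e > eps, C / np.maximum(e, eps), 0.0)
        if not np.any(a > 0):              # every margin satisfied: done
            break
        D = a[:, None] * Xb                # diag(a) @ Xb
        beta_new = np.linalg.solve(Xb.T @ D + R, D.T @ y)
        if np.linalg.norm(beta_new - beta) < 1e-8:
            beta = beta_new
            break
        beta = beta_new
    return beta[:d], beta[d]               # (w, b)
```

Only the points with nonzero weight at convergence influence the solution, which is the mechanism behind the compact (few support vectors) solutions the abstract mentions; swapping in a different loss function only changes the weight-update rule.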
Keywords :
iterative methods; learning automata; least squares approximations; pattern classification; SVMs; arbitrary loss function; iterative reweighted least squares; pattern recognition; risk minimization; support vector classifiers; support vector machines; Least squares approximation; Least squares methods; Pattern recognition; Quadratic programming; Risk management; Support vector machine classification; Support vector machines; Training data
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2003.809399
Filename :
1189628