Title of article
Empirical risk minimization for support vector classifiers
Author/Authors
A.R. Figueiras-Vidal, A. Artes-Rodriguez, F. Perez-Cruz, A. Navia-Vazquez
Issue Information
Journal with serial number, year 2003
From page
296
To page
Abstract
In this paper, we propose a general technique for solving support vector classifiers (SVCs) for an arbitrary loss function, relying on the application of an iterative reweighted least squares (IRWLS) procedure. We further show that three properties of the SVC solution can be written as conditions on the loss function. This technique allows the empirical risk minimization (ERM) inductive principle to be implemented on large-margin classifiers while, at the same time, obtaining very compact solutions (in terms of the number of support vectors). The improvements obtained by changing the SVC loss function are illustrated with synthetic and real-data examples.
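The abstract describes recasting the SVC training problem as a sequence of weighted least-squares solves. The paper's actual procedure, which handles arbitrary loss functions, is not reproduced here; the following is a minimal sketch of the IRWLS idea for the special case of a linear classifier with the plain hinge loss, assuming a first-order reweighting a_i = C * loss'(e_i) / e_i and a cap a_max for numerical stability. The function name irwls_svc and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def irwls_svc(X, y, C=1.0, n_iter=100, tol=1e-8, a_max=1e8):
    """Sketch of IRWLS for a linear SVC with hinge loss: minimize
        0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w @ x_i + b)),
    with labels y_i in {-1, +1}.
    """
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])   # design matrix with bias column
    theta = np.zeros(d + 1)                # theta = [w; b]
    for _ in range(n_iter):
        e = 1.0 - y * (Xa @ theta)         # margin slack of each sample
        # First-order reweighting: a_i = C * hinge'(e_i) / e_i = C / e_i
        # where e_i > 0, zero where the margin is already satisfied;
        # capped at a_max so tiny slacks do not blow up the weights.
        a = np.where(e > 0, np.minimum(C / np.maximum(e, 1e-12), a_max), 0.0)
        # Weighted ridge-like normal equations; since y_i^2 = 1,
        # a_i*(1 - y_i*f_i)^2 == a_i*(y_i - f_i)^2, so the targets are y.
        H = Xa.T @ (a[:, None] * Xa)
        H[:d, :d] += np.eye(d)             # regularize w, not the bias b
        H[d, d] += 1e-12                   # jitter keeps the system solvable
        theta_new = np.linalg.solve(H, Xa.T @ (a * y))
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta[:d], theta[d]

# Toy usage on linearly separable data:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
w, b = irwls_svc(X, y, C=10.0)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Samples with zero weight drop out of the normal equations, which is where the compactness the abstract mentions comes from: only samples at or inside the margin influence the solution.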
Keywords
Support vector classifiers (SVCs) , iterative reweighted least squares (IRWLS) , empirical risk minimization (ERM) , large-margin classifiers
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
Serial Year
2003
Record number
62811