DocumentCode :
1202250
Title :
Pruning error minimization in least squares support vector machines
Author :
De Kruif, Bas J. ; De Vries, Theo J A
Author_Institution :
Drebbel Inst. for Mechatronics, Univ. of Twente, Enschede, Netherlands
Volume :
14
Issue :
3
fYear :
2003
fDate :
5/1/2003 12:00:00 AM
Firstpage :
696
Lastpage :
702
Abstract :
The support vector machine (SVM) is a method for classification and for function approximation. This method commonly makes use of an ε-insensitive cost function, meaning that errors smaller than ε remain unpunished. As an alternative, a least squares support vector machine (LSSVM) uses a quadratic cost function. When the LSSVM method is used for function approximation, a nonsparse solution is obtained. Sparseness is imposed by pruning, i.e., recursively solving the approximation problem and subsequently omitting data that had a small error in the previous pass. However, a small approximation error in the previous pass does not reliably predict what the error will be after the sample has been omitted. In this paper, a procedure is introduced that selects from the data set the training sample that will introduce the smallest approximation error when it is omitted. It is shown that this pruning scheme outperforms the standard one.
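The LSSVM training and standard pruning step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel and uses the textbook LSSVM linear system, in which the training residual of sample i equals α_i/γ, so the standard scheme drops the sample with the smallest |α_i|. All function names and hyperparameter values here are illustrative.

```python
import numpy as np

def lssvm_train(X, y, gamma=100.0, sigma=0.3):
    """Solve the LSSVM linear system for function approximation:
        [ 0   1^T         ] [b]     [0]
        [ 1   K + I/gamma ] [alpha] = [y]
    with RBF kernel K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    n = len(y)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=0.3):
    """Evaluate the approximation f(x) = sum_i alpha_i k(x, x_i) + b."""
    d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ alpha + b

def prune_smallest_error(X, y, gamma=100.0, sigma=0.3):
    """One step of the *standard* pruning scheme: retrain, then omit the
    sample with the smallest |alpha_i| (equivalently the smallest
    training error, since the residual e_i = alpha_i / gamma)."""
    _, alpha = lssvm_train(X, y, gamma, sigma)
    keep = np.ones(len(y), dtype=bool)
    keep[np.argmin(np.abs(alpha))] = False
    return X[keep], y[keep]
```

The paper's point is that this heuristic is greedy with respect to the *current* residuals; its proposed scheme instead ranks each sample by the approximation error that would actually be introduced after omitting it and re-solving.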
Keywords :
function approximation; learning (artificial intelligence); learning automata; least squares approximations; minimisation; classification; data set; error minimization; insensitive cost function; least squares support vector machines; pruning; quadratic cost function; Approximation error; Cost function; Equations; Error correction; Function approximation; Least squares approximation; Least squares methods; Support vector machine classification; Support vector machines;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2003.810597
Filename :
1199664