Title :
Reducing the Number of Support Vectors of SVM Classifiers Using the Smoothed Separable Case Approximation
Author :
Geebelen, D. ; Suykens, J.A.K. ; Vandewalle, J.
Author_Institution :
Dept. of Electr. Eng., Katholieke Univ. Leuven, Leuven, Belgium
Date :
1 April 2012
Abstract :
In this brief, we propose a new method to reduce the number of support vectors of support vector machine (SVM) classifiers. We formulate the approximation of an SVM solution as a classification problem that is separable in the feature space. Due to this separability, the hard-margin SVM can be used to solve it. This approach, which we call the separable case approximation (SCA), is very similar to the cross-training algorithm, which is inspired by editing algorithms. The norm of the weight vector achieved by SCA can, however, become arbitrarily large. For that reason, we propose an algorithm, called the smoothed SCA (SSCA), that additionally upper-bounds the weight vector of the pruned solution and, for the commonly used kernels, further reduces the number of support vectors. The lower the chosen upper bound, the larger this extra reduction becomes. Upper-bounding the weight vector is important because it ensures numerical stability, reduces the time needed to find the pruned solution, and avoids overfitting during the approximation phase. On the examined datasets, SSCA drastically reduces the number of support vectors.
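The core idea described in the abstract can be sketched in a few lines: relabel the training data with the trained SVM's own predictions, which makes the problem separable in the feature space, then retrain a (near) hard-margin SVM on the relabeled data. The following is a minimal illustration using scikit-learn, not the authors' implementation; in particular, emulating the hard margin with a very large C, and mimicking SSCA's weight-vector bound with a smaller finite C, are simplifying assumptions.

```python
# Sketch of the SCA/SSCA idea (illustrative only, not the paper's algorithm).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# Step 1: train the original soft-margin SVM whose solution we want to prune.
original = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

# Step 2: relabel the training data with the original SVM's predictions;
# by construction, the relabeled problem is separable in the feature space.
y_relabeled = original.predict(X)

# Step 3 (SCA): solve the separable problem with a near hard-margin SVM
# (very large C). An SSCA-like variant bounds the weight vector, which we
# mimic here with a smaller, finite C (an assumption for illustration).
sca = SVC(kernel="rbf", C=1e6, gamma="scale").fit(X, y_relabeled)
ssca_like = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y_relabeled)

print("support vectors: original =", original.n_support_.sum(),
      "| SCA =", sca.n_support_.sum(),
      "| SSCA-like =", ssca_like.n_support_.sum())
```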
Keywords :
approximation theory; pattern classification; support vector machines; SVM classifier; cross-training algorithm; editing algorithm; hard-margin SVM; smoothed separable case approximation; weight vector; accuracy; approximation algorithms; approximation methods; kernel; training; vectors; run-time complexity; sparse approximation; support vector machine classifier
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
DOI :
10.1109/TNNLS.2012.2186314