Title :
SMO-based pruning methods for sparse least squares support vector machines
Author :
Zeng, Xiangyan ; Chen, Xue-wen
Author_Institution :
Dept. of Electr. & Comput. Eng., California State Univ., Northridge, CA, USA
Abstract :
Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by successively omitting the data points that introduce the smallest training errors and retraining on the remaining data. Such iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process; in addition, instead of selecting the pruning points by training error, we omit the data points that introduce the minimal change to the dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method in terms of computational cost and classification accuracy is demonstrated by numerical experiments.
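The iterative pruning loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' method: it retrains by directly solving the LS-SVM dual linear system (rather than by SMO, as the paper proposes), it omits the bias term for brevity, and it uses the classical error-based criterion of dropping the point with the smallest |alpha_i| (in LS-SVMs, alpha_i is proportional to the training error of point i), rather than the paper's dual-objective-change criterion. All function names and hyperparameter values are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM dual linear system (bias term omitted for brevity):
    #     (K + I / gamma) alpha = y
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + np.eye(len(y)) / gamma, y)

def prune_lssvm(X, y, keep_ratio=0.5, gamma=10.0, sigma=1.0):
    # Iteratively omit the point with the smallest |alpha_i| (the classical
    # error-based pruning criterion) and retrain on the remaining data.
    idx = np.arange(len(y))
    target = max(1, int(keep_ratio * len(y)))
    alpha = train_lssvm(X[idx], y[idx], gamma, sigma)
    while len(idx) > target:
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
        alpha = train_lssvm(X[idx], y[idx], gamma, sigma)  # retrain
    return idx, alpha

def predict(X_train, alpha, X_test, sigma=1.0):
    # Decision values of the pruned (sparse) LS-SVM on new points
    return rbf_kernel(X_test, X_train, sigma) @ alpha
```

The retraining step inside the loop is exactly the computational burden the paper targets: each pass re-solves the full dual problem, which SMO-based updates and the cheaper pruning criterion are designed to avoid.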
Keywords :
computational complexity; iterative methods; least squares approximations; minimisation; pattern classification; support vector machines; SMO-based pruning methods; sparse least squares support vector machines; classification accuracy; computational cost; data points; dual objective function; iterative retraining; pruning algorithm; pruning process; sequential minimal optimization method; training errors; Character generation; Computational efficiency; Computer errors; Costs; Equations; Iterative algorithms; Kernel; Least squares methods; Support vector machine classification; Support vector machines; Least squares support vector machine; pruning; sequential minimal optimization (SMO); sparseness; Algorithms; Artificial Intelligence; Computer Simulation; Least-Squares Analysis; Models, Statistical; Pattern Recognition, Automated;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2005.852239