Title :
Improved Sparse Least Square Support Vector Machines for the Function Estimation
Author :
Zhang, Yafeng ; Yu, Songnian
Author_Institution :
Shanghai Univ., Shanghai, China
Abstract :
Least squares support vector machines (LS-SVMs) are effective methods for classification and function estimation. Compared with standard support vector machines (SVMs), however, they have a drawback: sparseness is lost. Sparseness can be restored by omitting the less important data points during training and retraining on the remaining data, but such iterative retraining is computationally more expensive than training a non-sparse LS-SVM. In this paper, we describe a new pruning algorithm for LS-SVMs: the width of an ε-insensitive zone is introduced into the training process; the number of pruned points is adjusted according to the training performance, rather than fixed as a percentage of the training data; and cross training is applied during training. The performance of the improved LS-SVM pruning algorithm, in terms of computational cost and regression accuracy, is demonstrated by several experiments on the same chaotic time series data sets.
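The pruning idea described above can be sketched in code. The following is a minimal illustration, not the authors' actual algorithm: it uses the standard LS-SVM linear system (Suykens-style, with an RBF kernel) and a hypothetical pruning rule in which points whose training residual falls inside an ε-insensitive zone are candidates for removal, with the amount pruned per round adapting to the current support-value distribution rather than a fixed percentage. All function names, the quartile-based pruning fraction, and the stopping conditions are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between two point sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM regression linear system:
    # [0   1^T          ] [b    ]   [0]
    # [1   K + I/gamma  ] [alpha] = [y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b

def predict(Xtr, alpha, b, Xte, sigma=1.0):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

def prune_ls_svm(X, y, eps=0.05, max_rounds=10, gamma=10.0, sigma=1.0):
    # Hypothetical pruning loop: points fitted within the eps-insensitive
    # zone are candidates; of those, the least influential (smallest |alpha|)
    # quartile is dropped, then the model is retrained on the remainder.
    idx = np.arange(len(y))
    alpha, b = train_ls_svm(X[idx], y[idx], gamma, sigma)
    for _ in range(max_rounds):
        resid = np.abs(predict(X[idx], alpha, b, X[idx], sigma) - y[idx])
        inside = resid <= eps          # inside the eps-insensitive zone
        if not inside.any():
            break                      # nothing is safely removable
        thresh = np.quantile(np.abs(alpha[inside]), 0.25)
        keep = ~(inside & (np.abs(alpha) <= thresh))
        if keep.sum() < 5:             # guard against over-pruning
            break
        idx = idx[keep]
        alpha, b = train_ls_svm(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b
```

Because the pruning amount depends on how many points the current model already fits within ε, aggressive pruning happens only when the fit is good, which is one plausible reading of adjusting the pruning amount to training performance.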
Keywords :
function evaluation; iterative methods; least squares approximations; pattern classification; support vector machines; time series; ε-insensitive zone; chaotic time series; function estimation; iterative retraining; least square support vector machine; pruning algorithm; Computational efficiency; Equations; Function approximation; Kernel; Least squares approximation; Machine intelligence; Quadratic programming; Support vector machine classification; Support vector machines; Training data; LS-SVMs; ε-insensitive zone; function estimation; improved sparse LS-SVMs; pruning;
Conference_Titel :
2010 International Conference on Intelligent Computation Technology and Automation (ICICTA)
Conference_Location :
Changsha, China
Print_ISBN :
978-1-4244-7279-6
Electronic_ISBN :
978-1-4244-7280-2
DOI :
10.1109/ICICTA.2010.269