Abstract:
Parameter selection is a key issue in Support Vector Regression (SVR). Exhaustive grid search is time-consuming, especially when large-scale samples must be trained. A new method based on Parameter Subsection Selection and Self-Calling SVR (PSS-SC) is proposed. Parameter selection involves the penalty coefficient c, the kernel parameter g, and the insensitive coefficient p, and the combination (c, g, p) strongly affects the prediction accuracy of SVR. The proposed method selects the optimal parameter combination in less time while preserving SVR performance. First, the span of each parameter is trisected, yielding three median test points per parameter, so 27 parameter combinations (c, g, p) and the MSEs of the corresponding SVRs are obtained. Next, a mapping from the 27 combinations (c, g, p) to their MSEs is established, and the MSEs of the remaining parameter combinations are estimated from this mapping. The N combinations with the N smallest estimated MSEs are then selected as the TOP-N candidates. Finally, the TOP-N combinations (c, g, p) are applied to SVR to obtain their actual MSEs; the combination with the minimum MSE is taken as the best parameter combination. Experiments on 5 benchmark datasets show that the new method not only preserves prediction precision but also greatly reduces training time.
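The pipeline described above can be sketched in Python with scikit-learn. This is a minimal illustration under assumptions the abstract does not pin down: the synthetic dataset, the parameter spans, the value of N, and the form of the (c, g, p) → MSE mapping are all hypothetical here; the "self-calling" step is interpreted as fitting a second SVR as a surrogate over the 27 probed points, which may differ from the paper's exact construction.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data (a stand-in for a benchmark dataset).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def svr_mse(c, g, p):
    """Train one SVR with the given (c, g, p) and return its test MSE."""
    model = SVR(C=c, gamma=g, epsilon=p).fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

# Step 1: trisect each parameter's span -> three median test points each.
def trisect(lo, hi):
    step = (hi - lo) / 3.0
    return [lo + (i + 0.5) * step for i in range(3)]  # midpoint of each third

c_pts, g_pts, p_pts = trisect(1, 100), trisect(0.01, 1), trisect(0.01, 0.5)

# Step 2: train the 27 SVRs and record the mapping (c, g, p) -> MSE.
probe = np.array([[c, g, p] for c in c_pts for g in g_pts for p in p_pts])
probe_mse = np.array([svr_mse(*row) for row in probe])

# Step 3: "self-call" SVR to model the (c, g, p) -> MSE relationship
# (assumed interpretation of the self-calling step).
surrogate = SVR().fit(probe, probe_mse)

# Step 4: estimate MSEs for the remaining combinations on a finer grid
# and keep the TOP-N candidates with the smallest predicted MSE.
grid = np.array([[c, g, p]
                 for c in np.linspace(1, 100, 10)
                 for g in np.linspace(0.01, 1, 10)
                 for p in np.linspace(0.01, 0.5, 10)])
N = 5
top_n = grid[np.argsort(surrogate.predict(grid))[:N]]

# Step 5: evaluate only the TOP-N combinations with real SVRs;
# the minimum actual MSE gives the selected parameter combination.
best = min(top_n, key=lambda row: svr_mse(*row))
print("best (c, g, p):", best)
```

Only 27 + N real SVR trainings are needed instead of one per grid point, which is where the claimed time saving comes from.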
Keywords:
mean square error methods; optimisation; regression analysis; support vector machines; MSE; TOP-N combinations; mapping relationship; insensitive coefficient; optimal parameter combination; parameter optimization; parameter subsection selection; penalty coefficient; self-calling SVR; support vector regression (SVR); large-scale samples; parameter selection