Title :
A note on the decomposition methods for support vector regression
Author :
Liao, Shuo-Peng ; Lin, Hsuan-Tien ; Lin, Chih-Jen
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Abstract :
The dual formulation of support vector regression involves two closely related sets of variables. When a decomposition method is used, many existing approaches take pairs of indices from these two sets as the working set: they first select a base set and then expand it so that all indices appear in pairs. This makes the implementation different from that for support vector classification, and a larger optimization sub-problem must be solved in each iteration. In this paper we demonstrate, from several aspects, that there is no need to do so. In particular, we show that directly using the base set as the working set leads to similar convergence (a similar number of iterations). Therefore, with a smaller working set and a similar number of iterations, the program is not only simpler but can also be more efficient.
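As a rough illustration of the idea (a sketch, not the authors' exact procedure), the Python/NumPy code below treats the 2l dual variables beta = [alpha; alpha_star] as one vector and picks a working set directly from all 2l candidates, rather than expanding a base set into (alpha_i, alpha_i*) pairs. The dual form, the function names svr_dual_gradient and base_working_set, and the SVMlight-style maximal-violation selection rule are all assumptions made for illustration.

import numpy as np

def svr_dual_gradient(K, y, eps, beta):
    # Gradient of a standard SVR dual (assumed form):
    #   f(beta) = 0.5 (a - a*)^T K (a - a*) + eps * sum(a + a*) - y^T (a - a*)
    # where beta = [a; a*], subject to sum(a) - sum(a*) = 0, 0 <= beta <= C.
    l = len(y)
    Kd = K @ (beta[:l] - beta[l:])   # K (alpha - alpha_star)
    g = np.empty(2 * l)
    g[:l] = Kd + eps - y             # partials w.r.t. alpha
    g[l:] = -Kd + eps + y            # partials w.r.t. alpha_star
    return g

def base_working_set(g, beta, p, C, q):
    # Select up to q indices among all 2l variables at once (the "base set"),
    # with no pair expansion. p_t is the coefficient of variable t in the
    # equality constraint: +1 for alpha entries, -1 for alpha_star entries.
    # Candidates follow the usual maximal-violation rule: largest -p_t g_t
    # among variables free to move "up", smallest among those free to move "down".
    score = -p * g
    I_up = np.where(((p > 0) & (beta < C)) | ((p < 0) & (beta > 0)))[0]
    I_low = np.where(((p > 0) & (beta > 0)) | ((p < 0) & (beta < C)))[0]
    up_sorted = I_up[np.argsort(-score[I_up])]
    low_sorted = I_low[np.argsort(score[I_low])]
    chosen, seen = [], set()
    for a, b in zip(up_sorted, low_sorted):
        for t in (a, b):
            if t not in seen and len(chosen) < q:
                seen.add(t)
                chosen.append(t)
        if len(chosen) >= q:
            break
    return chosen

if __name__ == "__main__":
    # Tiny synthetic regression problem (linear kernel), for illustration only.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
    K = X @ X.T
    l, C, eps, q = len(y), 1.0, 0.1, 4
    beta = np.zeros(2 * l)                          # feasible starting point
    p = np.concatenate([np.ones(l), -np.ones(l)])   # equality-constraint signs
    g = svr_dual_gradient(K, y, eps, beta)
    print(base_working_set(g, beta, p, C, q))

In this sketch, the selected set may or may not contain both alpha_i and alpha_i* for the same i; the point of the paper is that nothing forces the expansion, so the sub-problem stays at size q instead of growing to 2q.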
Keywords :
convergence; duality (mathematics); iterative methods; learning automata; statistical analysis; SVM; decomposition methods; dual formulation; iteration; support vector regression; computer science; costs; Lagrangian functions; upper bound
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939580