Title :
A convergence rate estimate for the SVM decomposition method
Author :
Lai, D. ; Shilton, A. ; Mani, N. ; Palaniswami, Marimuthu
Author_Institution :
Dept. of Electr. & Comput. Syst. Eng., Monash Univ., Clayton, Vic., Australia
Date :
31 July-4 Aug. 2005
Abstract :
The training of support vector machines using the decomposition method has one drawback: selecting working sets such that convergence is as fast as possible. Lin has shown that the rate is linear in the worst case, under the assumption that all bounded support vectors have been determined. That analysis was based on the change in the objective function under an SVMlight working-set selection rule. However, the rate estimate given is independent of time and hence gives little indication of how the linear convergence speed varies over the iterations. In this initial analysis, we treat the convergence from a gradient contraction perspective. We propose a necessary and sufficient condition which, when satisfied, guarantees strict linear convergence of the algorithm. The condition can also be interpreted as a basic requirement that a sequence of working sets must meet in order to achieve such a convergence rate. Based on this condition, a time-dependent rate estimate is then derived. This estimate is shown to approach unity monotonically from below.
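To illustrate the notions of a linear convergence rate and a time-dependent estimate referred to above, the following is a minimal sketch in generic optimization notation; the symbols f, \alpha^k, \alpha^* and c_k are illustrative and are not taken from the paper.
% Q-linear convergence of decomposition iterates (illustrative notation):
% f is the dual objective, \alpha^k the iterate after k working-set updates,
% \alpha^* the optimum, and c_k the per-iteration contraction factor.
\[
  f(\alpha^{k+1}) - f(\alpha^{*}) \;\le\; c_k \,\bigl( f(\alpha^{k}) - f(\alpha^{*}) \bigr),
  \qquad 0 < c_k < 1 .
\]
% A time-independent estimate fixes c_k = c for all k; a time-dependent
% estimate lets c_k vary with k, here increasing monotonically toward 1 from below:
\[
  c_k \;\le\; c_{k+1} < 1, \qquad c_k \to 1^{-} \text{ as } k \to \infty .
\]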
Keywords :
convergence; gradient methods; support vector machines; SVM decomposition; convergence rate estimation; gradient contraction perspective; linear convergence speed; time-dependent rate estimate; Convergence; Lagrangian functions; Pattern recognition; Quadratic programming; Sufficient conditions; Supervised learning; Support vector machine classification; Support vector machines; Systems engineering and theory;
Conference_Titel :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
DOI :
10.1109/IJCNN.2005.1555977