Title :
An efficient active set method for SVM training without singular inner problems
Author :
Sentelle, Christopher ; Anagnostopoulos, Georgios C. ; Georgiopoulos, Michael
Abstract :
Efficiently implemented active set methods have been successfully applied to support vector machine (SVM) training. Compared to decomposition methods such as sequential minimal optimization (SMO), these active set methods offer higher precision and incremental training at the cost of additional memory requirements. However, all existing active set methods must handle singularities arising in the inner problem solved at each iteration, a complication that leads to more involved implementations and potential inefficiencies. Here, we adapt the revised simplex method, originally introduced by Rusin, to SVM training and show that it is an active set method similar to most existing ones, with the advantage of maintaining a nonsingular inner problem. We compare its performance to an existing active set method introduced by Scheinberg and demonstrate improved training times in some cases. We show that our method admits a slightly simpler implementation and offers advantages when iterative methods are applied to alleviate memory concerns. We also compare the performance of the active set methods against state-of-the-art decomposition implementations such as SVMLight and SMO.
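Illustrative_Sketch :
The following is a minimal sketch, not code from the paper: it illustrates, in Python/NumPy, the equality-constrained inner (KKT) system that a generic active set method for the SVM dual solves at each iteration, which is the system whose possible singularity the abstract refers to. All names (svm_inner_problem, K, at_upper, etc.) are invented for illustration, and the bias sign convention is assumed.

import numpy as np

def svm_inner_problem(K, y, C, free, at_upper):
    """Solve the inner KKT system over the free variables of the SVM dual.

    SVM dual:  min 0.5*a^T Q a - e^T a,  s.t.  y^T a = 0,  0 <= a_i <= C,
    with Q = (y y^T) * K.  Variables at a bound (0 or C) are held fixed;
    the free block a_F and the bias b come from the linear system
        [ Q_FF   y_F ] [ a_F ]   [ e_F - Q_FB a_B ]
        [ y_F^T   0  ] [  b  ] = [   -y_B^T a_B   ]
    If Q_FF is rank deficient (e.g. duplicated training points) or the free
    set is empty, this matrix is singular -- the situation existing active
    set methods must work around and the paper's approach avoids.
    at_upper is a boolean mask marking variables fixed at C (False on free indices).
    """
    Q = np.outer(y, y) * K
    a_fixed = np.where(at_upper, C, 0.0)      # bound variables: C or 0
    F = np.asarray(free)
    n_f = len(F)
    kkt = np.zeros((n_f + 1, n_f + 1))
    kkt[:n_f, :n_f] = Q[np.ix_(F, F)]         # Q_FF block
    kkt[:n_f, n_f] = y[F]                     # equality-constraint column
    kkt[n_f, :n_f] = y[F]                     # equality-constraint row
    rhs = np.empty(n_f + 1)
    rhs[:n_f] = 1.0 - Q[F] @ a_fixed          # e_F minus bound-variable contribution
    rhs[n_f] = -float(y @ a_fixed)            # y_F^T a_F = -y_B^T a_B
    sol = np.linalg.solve(kkt, rhs)           # raises if the KKT matrix is singular
    return sol[:n_f], sol[n_f]                # free alphas and bias b

# Toy usage (hypothetical data): linear kernel, indices 0 and 1 free, 2 at C, 3 at 0.
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])
y = np.array([1.0, -1.0, 1.0, -1.0])
alpha_free, b = svm_inner_problem(X @ X.T, y, 1.0, [0, 1], np.array([False, False, True, False]))
# The resulting step may violate the box constraints, in which case an active
# set method updates the free/bounded partition and solves the system again.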
Keywords :
iterative methods; optimisation; support vector machines; SVM training; SVMLight; active set method; incremental training; memory requirement; revised simplex method; sequential minimal optimization; convergence; cost function; equations; kernel; neural networks; optimization methods; pricing; quadratic programming
Conference_Titel :
2009 International Joint Conference on Neural Networks (IJCNN 2009)
Conference_Location :
Atlanta, GA
Print_ISBN :
978-1-4244-3548-7
Electronic_ISSN :
1098-7576
DOI :
10.1109/IJCNN.2009.5178948