Title :
Parallel Coordinate Descent for the AdaBoost Problem
Author_Institution :
Sch. of Math., Univ. of Edinburgh, Edinburgh, UK
Abstract :
We design a randomised parallel version of AdaBoost based on previous studies of parallel coordinate descent. The algorithm exploits the fact that the logarithm of the exponential loss has a coordinate-wise Lipschitz continuous gradient in order to define the step lengths. We prove the convergence of this randomised AdaBoost algorithm and derive a theoretical parallelisation speedup factor. Finally, we provide numerical experiments on learning problems of various sizes showing that the algorithm is competitive with competing approaches, especially on large-scale problems.
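Algorithm_Sketch :
The abstract describes randomised parallel coordinate descent on f(lambda) = log((1/m) * sum_i exp(-(A lambda)_i)), where A_ij = y_i h_j(x_i) is the margin matrix, with step lengths set by the coordinate-wise Lipschitz constants of the gradient. Below is a minimal Python sketch of such a scheme, not the paper's exact method: the constants L_j = max_i A_ij^2 and the parallel step-size correction beta = 1 + (tau-1)(omega-1)/(n-1) (a Richtárik-Takáč-style ESO factor, with omega the maximum number of nonzeros in a row of A) are illustrative assumptions.

```python
import numpy as np

def exp_loss_log(A, lam):
    """f(lam) = log((1/m) * sum_i exp(-(A @ lam)_i)): the logarithm of the
    exponential loss, with A[i, j] = y_i * h_j(x_i) the margin matrix."""
    z = -(A @ lam)
    zmax = z.max()
    return zmax + np.log(np.exp(z - zmax).sum()) - np.log(A.shape[0])

def parallel_coordinate_descent(A, tau=4, n_iters=2000, seed=0):
    """Randomised parallel coordinate descent sketch: each iteration samples
    tau coordinates uniformly at random and updates them simultaneously with
    step lengths 1 / (beta * L_j)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    lam = np.zeros(n)
    # Coordinate-wise Lipschitz constants of grad f (assumed bound): along
    # coordinate j, f'' is a softmax variance of A[:, j], at most max_i A_ij^2.
    L = np.maximum((A ** 2).max(axis=0), 1e-12)
    # Parallel step-size correction (assumed ESO-style factor);
    # omega = maximum number of nonzeros in a row of A.
    omega = max(int((A != 0).sum(axis=1).max()), 1)
    beta = 1.0 + (tau - 1) * (omega - 1) / max(n - 1, 1)
    for _ in range(n_iters):
        # A real implementation would maintain A @ lam incrementally
        # instead of recomputing it at every iteration.
        z = -(A @ lam)
        w = np.exp(z - z.max())
        w /= w.sum()                      # softmax weights over the m examples
        S = rng.choice(n, size=tau, replace=False)
        grad_S = -(w @ A[:, S])           # partial derivatives of f on the sample
        lam[S] -= grad_S / (beta * L[S])  # the tau updates can run in parallel
    return lam

if __name__ == "__main__":
    # Toy usage: random +/-1 weak-learner outputs, 200 examples, 50 hypotheses.
    rng = np.random.default_rng(1)
    A = rng.choice([-1.0, 1.0], size=(200, 50))
    lam = parallel_coordinate_descent(A)
    print(exp_loss_log(A, lam))
```

With binary weak learners one has |A_ij| <= 1, so L_j <= 1; the beta factor then trades off the number of coordinates updated in parallel (tau) against the coupling between coordinates (omega), which is where the theoretical speedup factor mentioned in the abstract comes from.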
Keywords :
learning (artificial intelligence); parallel algorithms; randomised algorithms; competing approaches; coordinate-wise Lipschitz continuous gradient; exponential loss; parallel coordinate descent; parallelisation speedup factor; randomised AdaBoost algorithm; randomised parallel version; Acceleration; Algorithm design and analysis; Boosting; Complexity theory; Convergence; Minimization; Optimization; AdaBoost; iteration complexity; parallel algorithm; randomised coordinate descent;
Conference_Title :
2013 12th International Conference on Machine Learning and Applications (ICMLA)
Conference_Location :
Miami, FL
DOI :
10.1109/ICMLA.2013.72