DocumentCode :
1735020
Title :
Parallel Coordinate Descent for the Adaboost Problem
Author :
Fercoq, Olivier
Author_Institution :
Sch. of Math., Univ. of Edinburgh, Edinburgh, UK
Volume :
1
fYear :
2013
Firstpage :
354
Lastpage :
358
Abstract :
We design a randomised parallel version of Adaboost based on previous studies of parallel coordinate descent. The algorithm exploits the fact that the logarithm of the exponential loss is a function with coordinate-wise Lipschitz continuous gradient in order to define the step lengths. We prove convergence of this randomised Adaboost algorithm and derive a theoretical parallelisation speedup factor. Finally, we provide numerical examples on learning problems of various sizes showing that the algorithm is competitive with existing approaches, especially on large-scale problems.
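Sketch :
As a rough illustration of the approach summarised in the abstract, the Python sketch below runs randomised parallel coordinate descent on f(alpha) = log((1/m) sum_i exp(-(A alpha)_i)), where A[i, j] = y_i * h_j(x_i) is the margin matrix of weak learner h_j on example (x_i, y_i). The function name, the sampling size tau, and the conservative dampening beta = tau are assumptions made for this sketch only; the paper derives a sharper ESO-based step-size factor, which is what yields the stated parallelisation speedup.

import numpy as np

# Illustrative sketch, not the paper's exact algorithm: randomised parallel
# coordinate descent on the log of the exponential loss.
def parallel_cd_adaboost(A, tau=8, n_iters=500, seed=0):
    m, n = A.shape
    rng = np.random.default_rng(seed)
    # Coordinate-wise Lipschitz constants: the Hessian of log-sum-exp is
    # dominated by A^T diag(p) A, so L_j <= max_i A[i, j]^2.
    L = np.maximum(np.max(A ** 2, axis=0), 1e-12)
    beta = float(tau)  # conservative dampening (assumption); the paper's ESO is sharper
    alpha = np.zeros(n)
    for _ in range(n_iters):
        # p = softmax(-margins): the usual Adaboost example weights.
        t = -(A @ alpha)
        w = np.exp(t - t.max())
        p = w / w.sum()
        # Sample tau coordinates uniformly at random and update them
        # simultaneously, each with step -grad_j / (beta * L_j).
        S = rng.choice(n, size=tau, replace=False)
        grad_S = -(A[:, S].T @ p)
        alpha[S] -= grad_S / (beta * L[S])
    return alpha

# Usage on a synthetic +/-1 margin matrix (hypothetical data):
# rng = np.random.default_rng(1)
# A = rng.choice([-1.0, 1.0], size=(200, 50))
# alpha = parallel_cd_adaboost(A)
# print(np.log(np.mean(np.exp(-(A @ alpha)))))  # decreases from log(1) = 0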
Keywords :
learning (artificial intelligence); parallel algorithms; randomised algorithms; concurrent approach; coordinate-wise Lipschitz continuous gradient; exponential loss; parallel coordinate descent; parallelisation speedup factor; randomised Adaboost algorithm; randomised parallel version; Acceleration; Algorithm design and analysis; Boosting; Complexity theory; Convergence; Minimization; Optimization; Adaboost; iteration complexity; parallel algorithm; randomised coordinate descent;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 12th International Conference on Machine Learning and Applications (ICMLA)
Conference_Location :
Miami, FL, USA
Type :
conf
DOI :
10.1109/ICMLA.2013.72
Filename :
6784642