DocumentCode :
155622
Title :
Fast distributed coordinate descent for non-strongly convex losses
Author :
Fercoq, Olivier ; Qu, Zheng ; Richtarik, Peter ; Takac, Martin
Author_Institution :
Sch. of Math., Univ. of Edinburgh, Edinburgh, UK
fYear :
2014
fDate :
21-24 Sept. 2014
Firstpage :
1
Lastpage :
6
Abstract :
We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k²) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of stepsize parameters. We have implemented the method on Archer, the largest supercomputer in the UK, and show that the method is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
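A minimal single-machine sketch of the underlying technique, assuming Python with NumPy: plain randomized proximal coordinate descent on a small synthetic LASSO problem. The paper's method additionally uses Nesterov-type acceleration (which yields the O(1/k²) rate) and distributes coordinate updates across compute nodes; neither is reproduced here, and the function and variable names (rcd_lasso, soft_threshold, lam) are hypothetical.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.| (shrinks v toward zero by t).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rcd_lasso(A, b, lam, iters=10000, seed=0):
    # Randomized proximal coordinate descent for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    # (plain, non-accelerated variant).
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = -b                        # residual r = A @ x - b (x = 0 initially)
    L = (A ** 2).sum(axis=0)      # per-coordinate Lipschitz constants ||A[:, i]||^2
    for _ in range(iters):
        i = rng.integers(n)                    # sample a coordinate uniformly
        g = A[:, i] @ r                        # partial gradient of the smooth part
        xi_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        r += A[:, i] * (xi_new - x[i])         # keep the residual in sync with x
        x[i] = xi_new
    return x

# Usage on a tiny synthetic instance with a sparse ground truth:
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = rcd_lasso(A, b, lam=0.1)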
Keywords :
convergence of numerical methods; convex programming; distributed algorithms; iterative methods; minimisation; randomised algorithms; Archer; LASSO optimization problem; distributed randomized coordinate descent method; fast distributed coordinate descent; optimal O(1/k²) convergence rate; iteration counter; regularized non-strongly convex loss function minimization; stepsize parameters; supercomputer; Acceleration; Big data; Complexity theory; Computers; Convergence; Support vector machines; Upper bound; Coordinate descent; acceleration; distributed algorithms
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Reims, France
Type :
conf
DOI :
10.1109/MLSP.2014.6958862
Filename :
6958862