Title :
Fast distributed coordinate descent for non-strongly convex losses
Author :
Fercoq, Olivier ; Qu, Zheng ; Richtárik, Peter ; Takáč, Martin
Author_Institution :
Sch. of Math., Univ. of Edinburgh, Edinburgh, UK
Abstract :
We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k²) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of the stepsize parameters. We have implemented the method on Archer (the largest supercomputer in the UK) and show that it is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
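To illustrate the class of problems the abstract describes, the sketch below applies serial randomized proximal coordinate descent to a small LASSO problem, min_x 0.5‖Ax − b‖² + λ‖x‖₁. This is a minimal single-machine, non-accelerated variant for illustration only; the paper's actual method is distributed and accelerated, and the function and parameter names here are assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.|
    return np.sign(v) * max(abs(v) - t, 0.0)

def rcd_lasso(A, b, lam, n_iters=20000, seed=0):
    """Serial randomized proximal coordinate descent for
    0.5*||Ax - b||^2 + lam*||x||_1.
    Illustrative sketch only -- not the paper's distributed
    accelerated scheme."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = -b.astype(float).copy()        # residual r = A @ x - b (x = 0)
    L = (A ** 2).sum(axis=0)           # per-coordinate Lipschitz constants
    for _ in range(n_iters):
        i = rng.integers(n)            # sample one coordinate uniformly
        if L[i] == 0.0:
            continue
        g = A[:, i] @ r                # partial gradient along coordinate i
        new_xi = soft_threshold(x[i] - g / L[i], lam / L[i])
        delta = new_xi - x[i]
        if delta != 0.0:
            r += delta * A[:, i]       # keep residual in sync with x
            x[i] = new_xi
    return x
```

The distributed method in the paper partitions coordinates across compute nodes and updates a random subset in parallel per iteration, with stepsizes chosen from the theory the abstract mentions; the per-coordinate proximal step has the same soft-thresholding form shown here.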
Keywords :
convergence of numerical methods; convex programming; distributed algorithms; iterative methods; minimisation; randomised algorithms; Archer; LASSO optimization problem; distributed randomized coordinate descent method; fast distributed coordinate descent; optimal O(1/k²) convergence rate; iteration counter; regularized non-strongly convex loss function minimization; stepsize parameters; supercomputer; Acceleration; Big data; Complexity theory; Computers; Convergence; Support vector machines; Upper bound; Coordinate descent; acceleration; distributed algorithms
Conference_Titel :
2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Reims, France
DOI :
10.1109/MLSP.2014.6958862