• DocumentCode
    3743136
  • Title
    Distributed and parallel random coordinate descent methods for huge convex programming over networks
  • Author
    Ion Necoara
  • Author_Institution
    Automatic Control and Systems Engineering Department, University Politehnica Bucharest, 060042, Romania
  • fYear
    2015
  • Firstpage
    425
  • Lastpage
    430
  • Abstract
    In this paper we develop parallel random coordinate gradient descent methods for minimizing huge linearly constrained separable convex problems over networks. Since the problem has coupled constraints, we devise a family of algorithms that updates τ ≥ 2 (block) components in parallel per iteration. Moreover, the algorithms are well suited to distributed and parallel computation, and their per-iteration complexity is cheaper than that of the full gradient method when the number of nodes N in the network is huge. We prove that these methods obtain, in expectation, an ε-accurate solution in at most O(N/(τε)) iterations, so the convergence rate depends linearly on the number of (block) components to be updated. We also describe several applications that fit our framework, in particular the convex feasibility problem. Numerically, we show that the parallel coordinate descent method with τ > 2 outperforms its basic counterpart corresponding to τ = 2.
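    The core idea in the abstract — randomly picking τ (block) components per iteration and updating them jointly so that a coupled linear constraint stays satisfied — can be sketched for the basic τ = 2 case. The toy instance below (the quadratic costs `a`, `c`, budget `b`, iteration count, and seed are all illustrative assumptions, not from the paper) minimizes a separable quadratic subject to a single coupling constraint Σᵢ xᵢ = b; moving along the direction eᵢ − eⱼ keeps the iterate feasible:

    ```python
    import random

    # Hypothetical toy instance (not from the paper): minimize
    # sum_i 0.5*a[i]*(x[i]-c[i])**2  subject to  sum_i x[i] == b,
    # a single coupling constraint over the "network" of N components.
    a = [1.0, 2.0, 4.0, 8.0]
    c = [3.0, -1.0, 0.5, 2.0]
    b = 1.0
    N = len(a)

    def grad(i, x):
        # Partial derivative of the separable objective w.r.t. x[i].
        return a[i] * (x[i] - c[i])

    # Feasible starting point: put the whole budget on one coordinate.
    x = [0.0] * N
    x[0] = b

    random.seed(0)
    for _ in range(2000):
        # Pick tau = 2 distinct coordinates at random; any step of the form
        # x[i] += t, x[j] -= t preserves the constraint sum(x) == b.
        i, j = random.sample(range(N), 2)
        # Exact line minimization along e_i - e_j for the separable quadratic.
        t = -(grad(i, x) - grad(j, x)) / (a[i] + a[j])
        x[i] += t
        x[j] -= t

    # Closed-form optimum via the Lagrange multiplier of the coupling constraint,
    # used here only to check that the random pairwise updates converge.
    lam = (b - sum(c)) / sum(1.0 / ai for ai in a)
    x_star = [ci + lam / ai for ai, ci in zip(a, c)]
    ```

    Updating τ > 2 components per iteration generalizes this by restricting the step to directions whose entries sum to zero over the chosen components; the abstract's O(N/(τε)) bound reflects how the larger parallel update accelerates convergence.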
  • Keywords
    "Linear programming","Optimization","Algorithm design and analysis","Convergence","Eigenvalues and eigenfunctions","Complexity theory","Convex functions"
  • Publisher
    ieee
  • Conference_Titel
    Decision and Control (CDC), 2015 IEEE 54th Annual Conference on
  • Type
    conf
  • DOI
    10.1109/CDC.2015.7402237
  • Filename
    7402237