• DocumentCode
    14916
  • Title
    Minkovskian Gradient for Sparse Optimization
  • Author
    Amari, Shun-Ichi; Yukawa, Masahiro
  • Author_Institution
    RIKEN Brain Sci. Inst., Wako, Japan
  • Volume
    7
  • Issue
    4
  • fYear
    2013
  • fDate
    Aug. 2013
  • Firstpage
    576
  • Lastpage
    585
  • Abstract
    Information geometry is used to elucidate convex optimization problems under the L1 constraint. A convex function induces a Riemannian metric and two dually coupled affine connections in the manifold of parameters of interest. A generalized Pythagorean theorem and a projection theorem hold in such a manifold. An extended LARS algorithm, applicable to both the under-determined and over-determined cases, is studied, and properties of its solution path are given. The algorithm is shown to be a Minkovskian gradient-descent method, which moves in the steepest direction of a target function under the Minkovskian L1 norm. Two dually coupled affine coordinate systems are useful for analyzing the solution path.
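    The steepest-descent direction under the L1 norm described in the abstract can be illustrated with a minimal sketch (not the paper's implementation): minimizing ⟨g, d⟩ subject to ||d||₁ ≤ 1 yields a signed coordinate axis, d = −sign(gᵢ) eᵢ with i = argmaxᵢ |gᵢ|, so each step updates only the coordinate with the largest gradient magnitude. The toy quadratic objective, step size, and function name below are illustrative assumptions.

    ```python
    import numpy as np

    def l1_steepest_descent(A, b, steps=2000, lr=0.01):
        """Greedy coordinate descent on f(w) = 0.5 * ||A w - b||^2,
        i.e. steepest descent under the (Minkovskian) L1 norm."""
        w = np.zeros(A.shape[1])
        for _ in range(steps):
            g = A.T @ (A @ w - b)        # gradient of the quadratic
            i = np.argmax(np.abs(g))     # coordinate of steepest L1 descent
            w[i] -= lr * np.sign(g[i])   # unit-L1 step along that axis
        return w

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse ground truth
    w = l1_steepest_descent(A, b)
    print(np.round(w, 1))
    ```

    With a fixed step size the iterate oscillates within O(lr) of the minimizer; the paper's extended LARS instead follows the exact piecewise-linear solution path.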
  • Keywords
    geometry; gradient methods; optimisation; signal processing; Information geometry; Minkovskian gradient-descent method; Riemannian metric; convex optimization problems; extended LARS algorithm; generalized Pythagorean theorem; projection theorem; sparse optimization; steepest direction; Convex functions; Information geometry; Joining processes; Manifolds; Measurement; Optimized production technology; Vectors; Extended LARS; L1-constraint; information geometry; sparse convex optimization
  • fLanguage
    English
  • Journal_Title
    IEEE Journal of Selected Topics in Signal Processing
  • Publisher
    IEEE
  • ISSN
    1932-4553
  • Type
    jour
  • DOI
    10.1109/JSTSP.2013.2241014
  • Filename
    6414587