Title :
Monotonicity and restart in fast gradient methods
Author :
Giselsson, Pontus ; Boyd, Stephen
Author_Institution :
Electr. Eng. Dept., Stanford Univ., Stanford, CA, USA
Abstract :
Fast gradient methods are known to be nonmonotone algorithms, and oscillations typically occur around the solution. To avoid this behavior, we propose in this paper a fast gradient method with restart, and analyze its convergence rate. The proposed algorithm bears similarities to other algorithms in the literature, but differs in a key point that enables theoretical convergence rate results. The efficiency of the proposed method is demonstrated by two numerical examples.
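To illustrate the idea described in the abstract, the sketch below implements a generic accelerated (Nesterov-type) gradient method with a function-value restart heuristic: whenever the objective increases, the momentum is reset and the step is retaken from the last iterate. This is an assumption-laden illustration of the general restart idea, not the specific scheme analyzed in the paper; the function names and the quadratic test problem are invented for the example.

```python
import numpy as np

def fast_gradient_restart(grad, f, x0, L, iters=500):
    """Accelerated gradient method with a function-value restart heuristic.

    Generic sketch (not the paper's exact algorithm): when the objective
    increases, the momentum parameter is reset and the step is retaken
    from the last iterate, which restores monotone decrease.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0                               # momentum parameter
    f_prev = f(x)
    for _ in range(iters):
        x_new = y - grad(y) / L           # gradient step from extrapolated point
        if f(x_new) > f_prev:             # monotonicity violated -> restart
            t = 1.0                       # reset momentum
            y = x                         # restart from the last iterate
            x_new = y - grad(y) / L       # plain gradient step cannot increase f
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum (extrapolation)
        x, t, f_prev = x_new, t_new, f(x_new)
    return x

# Example: strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# a setting where plain fast gradient methods typically oscillate.
rng = np.random.default_rng(0)
Q = rng.standard_normal((20, 20))
A = Q.T @ Q + np.eye(20)                  # positive definite Hessian
b = rng.standard_normal(20)
L = np.linalg.eigvalsh(A).max()           # Lipschitz constant of the gradient
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)            # exact minimizer, for comparison
x_hat = fast_gradient_restart(grad, f, np.zeros(20), L)
```

On this quadratic the restarted iterates decrease the objective monotonically and converge to the minimizer, whereas the unrestarted method would oscillate around it.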
Keywords :
convergence of numerical methods; gradient methods; convergence rate; fast gradient methods; nonmonotone algorithms; oscillations; restart; Algorithm design and analysis; Convergence; Gradient methods; Prediction algorithms; Predictive control;
Conference_Titel :
2014 IEEE 53rd Annual Conference on Decision and Control (CDC)
Conference_Location :
Los Angeles, CA
Print_ISBN :
978-1-4799-7746-8
DOI :
10.1109/CDC.2014.7040179