DocumentCode :
3640266
Title :
Convergence and convergence rate of stochastic gradient search in the case of multiple and non-isolated extrema
Author :
Vladislav B. Tadić
Author_Institution :
Department of Mathematics, University of Bristol, University Walk, BS8 1TW, United Kingdom
fYear :
2010
Firstpage :
5321
Lastpage :
5326
Abstract :
The asymptotic behavior of stochastic gradient algorithms is studied. Relying on results from differential geometry (the Lojasiewicz gradient inequality), almost sure point-convergence is demonstrated and relatively tight almost sure bounds on the convergence rate are derived. In sharp contrast to all existing results of this kind, the asymptotic results obtained here do not require the objective function (associated with the stochastic gradient search) to have an isolated minimum at which the Hessian of the objective function is strictly positive definite. Using the obtained results, the asymptotic behavior of recursive prediction error identification methods is analyzed.
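For orientation, a minimal sketch of the kind of recursion and regularity condition referred to in the abstract; the notation below is generic and assumed for illustration, not taken from the paper:

\theta_{n+1} = \theta_n - \gamma_n \bigl( \nabla f(\theta_n) + \xi_n \bigr), \qquad n \ge 0,

where \gamma_n > 0 are step sizes and \xi_n is gradient noise. The Lojasiewicz gradient inequality, in its standard local form, asks that near a critical point a there exist \delta > 0, c > 0 and \mu \in (0,1] such that

|f(\theta) - f(a)|^{1-\mu} \le c \, \| \nabla f(\theta) \| \qquad \text{whenever } \| \theta - a \| < \delta,

which controls how flat f can be around its critical set and is what replaces the usual isolated-minimum / positive-definite-Hessian assumption.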
Keywords :
"Convergence","Stochastic processes","Heuristic algorithms","Prediction algorithms","Signal processing algorithms","Algorithm design and analysis","Approximation methods"
Publisher :
IEEE
Conference_Titel :
2010 49th IEEE Conference on Decision and Control (CDC)
ISSN :
0743-1546
Print_ISBN :
978-1-4244-7745-6
Type :
conf
DOI :
10.1109/CDC.2010.5717836
Filename :
5717836