DocumentCode :
2607223
Title :
Riemannian structure of some new gradient descent learning algorithms
Author :
Mahoney, R.E. ; Williamson, Robert C.
Author_Institution :
Dept. of Electr. & Comput. Syst. Eng., Monash Univ., Clayton, Vic., Australia
fYear :
2000
fDate :
2000
Firstpage :
197
Lastpage :
202
Abstract :
We consider some generalizations of the classical LMS learning algorithm, including the exponentiated gradient (EG) algorithm. We show how one can develop these algorithms in terms of a prior distribution over the weight space. Our framework subsumes the notion of “link-functions”. Differential geometric methods are used to develop the algorithms as gradient descent with respect to the natural gradient in the Riemannian structure induced by the prior distribution. This provides a Bayesian Riemannian interpretation of the EG and related algorithms. We relate our work to that of Amari (1985, 1997, 1998) and others who used similar tools in a different manner. Simulation experiments illustrating the behaviour of the new algorithms are presented.
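The contrast the abstract draws between the classical LMS algorithm and the exponentiated gradient (EG) algorithm can be sketched as follows. This is a minimal illustration of the two standard update rules, not the paper's own derivation: LMS takes an additive gradient step on the squared error, while EG applies a multiplicative (exponentiated) update and renormalizes the weights. The step size `eta` and the example vectors are assumptions for illustration.

```python
import numpy as np

def lms_step(w, x, y, eta=0.1):
    # Classical LMS: additive gradient step on the squared prediction error.
    err = y - w @ x
    return w + eta * err * x

def eg_step(w, x, y, eta=0.1):
    # Exponentiated gradient: multiplicative update of each weight,
    # followed by renormalization so the weights stay on the simplex.
    err = y - w @ x
    v = w * np.exp(eta * err * x)
    return v / v.sum()

# Toy usage: one step of each rule from uniform weights.
w = np.ones(3) / 3
x = np.array([1.0, 0.0, 0.0])
y = 1.0
w_lms = lms_step(w, x, y)
w_eg = eg_step(w, x, y)
```

Both rules can be viewed as gradient descent, but with respect to different geometries on the weight space; the paper's framework recovers such variants as natural-gradient descent under the Riemannian metric induced by a prior distribution.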
Keywords :
Bayes methods; differential geometry; gradient methods; learning (artificial intelligence); signal processing; Bayesian Riemannian interpretation; Riemannian structure; differential geometric methods; exponentiated gradient algorithm; gradient descent learning algorithms; natural gradient; prior distribution; weight space; Algorithm design and analysis; Bayesian methods; Cost function; Least squares approximation; Loss measurement; Modeling; Signal processing algorithms; Stochastic processes; Stochastic resonance; Systems engineering and theory;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Adaptive Systems for Signal Processing, Communications, and Control Symposium 2000. AS-SPCC. The IEEE 2000
Conference_Location :
Lake Louise, Alta.
Print_ISBN :
0-7803-5800-7
Type :
conf
DOI :
10.1109/ASSPCC.2000.882470
Filename :
882470