DocumentCode :
2213579
Title :
Showing the equivalence of two training algorithms. I
Author :
Koch, M. ; Fischer, I. ; Berthold, M.R.
Author_Institution :
Tech. Univ. Berlin, Germany
Volume :
1
fYear :
1998
fDate :
4-8 May 1998
Firstpage :
447
Abstract :
Graph transformations offer a powerful way to formally specify neural networks and their corresponding training algorithms. This formalism can be used to prove properties of these algorithms. In this paper, graph transformations are used to show the equivalence of two training algorithms for recurrent neural networks: backpropagation through time and a variant of real-time backpropagation. In addition to this proof, a whole class of related training algorithms emerges from the formalism used.
Keywords :
backpropagation; graph theory; recurrent neural nets; transforms; graph transformations; neural networks; real-time backpropagation; recurrent neural networks; training algorithm equivalence; Backpropagation algorithms; Computer networks; Convergence; Flow graphs; Labeling; Network topology; Neural networks; Recurrent neural networks;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
The 1998 IEEE International Joint Conference on Neural Networks Proceedings (IEEE World Congress on Computational Intelligence)
Conference_Location :
Anchorage, AK
ISSN :
1098-7576
Print_ISBN :
0-7803-4859-1
Type :
conf
DOI :
10.1109/IJCNN.1998.682308
Filename :
682308