Title :
Showing the equivalence of two training algorithms. I
Author :
Koch, M. ; Fischer, I. ; Berthold, M.R.
Author_Institution :
Tech. Univ. Berlin, Germany
Abstract :
Graph transformations offer a powerful way to formally specify neural networks and their corresponding training algorithms. This formalism can be used to prove properties of these algorithms. In this paper, graph transformations are used to show the equivalence of two training algorithms for recurrent neural networks: backpropagation through time and a variant of real-time backpropagation. In addition to this proof, a whole class of related training algorithms emerges from the formalism used.
Keywords :
backpropagation; graph theory; recurrent neural nets; transforms; graph transformations; neural networks; real-time backpropagation; recurrent neural networks; training algorithm equivalence; Backpropagation algorithms; Computer networks; Convergence; Flow graphs; Labeling; Network topology; Neural networks; Recurrent neural networks;
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings (IEEE World Congress on Computational Intelligence)
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.682308