Author_Institution :
Learning Syst. Dept., Siemens Corp. Res. Inc., Princeton, NJ, USA
Abstract :
Surveys learning algorithms for recurrent neural networks with hidden units and puts the various techniques into a common framework. The author discusses fixed-point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann machines, and non-fixed-point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propagation, an on-line technique that uses adjoint equations, and variations thereof, are also discussed. In many cases, the unified presentation leads to generalizations of various sorts. The author discusses advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, then continues with some "tricks of the trade" for training, using, and simulating continuous-time and recurrent neural networks. The author presents some simulations and, at the end, addresses issues of computational complexity and learning speed.
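To make the backpropagation-through-time family named in the abstract concrete, here is a minimal sketch of BPTT for a one-layer tanh recurrent network with a squared-error loss on the final hidden state. All names (W, U, hs, target) and the specific loss are illustrative assumptions for this sketch, not notation from the surveyed paper.

```python
import numpy as np

# Minimal BPTT sketch (assumed setup, not the paper's notation):
# a vanilla recurrent network h_{t+1} = tanh(W h_t + U x_t),
# trained on a squared-error loss at the final time step.

rng = np.random.default_rng(0)
n_in, n_hid, T = 3, 5, 10

W = rng.normal(scale=0.1, size=(n_hid, n_hid))   # recurrent weights
U = rng.normal(scale=0.1, size=(n_hid, n_in))    # input weights
xs = rng.normal(size=(T, n_in))                  # input sequence
target = rng.normal(size=n_hid)                  # target for final state

# Forward pass: unroll the network in time, storing every hidden state
# (the storage cost that distinguishes BPTT from on-line methods).
hs = [np.zeros(n_hid)]
for t in range(T):
    hs.append(np.tanh(W @ hs[-1] + U @ xs[t]))

# Loss: squared error on the final hidden state, L = ||h_T - target||^2.
err = hs[-1] - target

# Backward pass: propagate the adjoint (error signal) backward in time,
# accumulating weight gradients at each step.
dW, dU = np.zeros_like(W), np.zeros_like(U)
delta = 2.0 * err                                # dL/dh_T
for t in reversed(range(T)):
    dpre = delta * (1.0 - hs[t + 1] ** 2)        # backprop through tanh
    dW += np.outer(dpre, hs[t])
    dU += np.outer(dpre, xs[t])
    delta = W.T @ dpre                           # adjoint passed to step t
```

By contrast, an on-line method such as the forward propagation discussed in the abstract avoids storing the whole trajectory by carrying sensitivity information forward in time instead, trading the memory cost of unrolling for extra per-step computation.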
Keywords :
Boltzmann machines; backpropagation; recurrent neural nets; Elman's history cutoff; Jordan's output feedback architecture; backpropagation through time; computational complexity; deterministic Boltzmann machines; dynamic recurrent neural networks; fixed point learning algorithms; forward propagation; gradient calculations; learning speed; nonfixed point algorithms; recurrent backpropagation; temporally continuous neural networks; Backpropagation algorithms; Clocks; Computational complexity; Computational modeling; Equations; History; Machine learning; Neural networks; Output feedback; Recurrent neural networks