DocumentCode :
2314521
Title :
Recurrent nets that time and count
Author :
Gers, Felix A. ; Schmidhuber, Jürgen
Author_Institution :
IDSIA, Lugano, Switzerland
Volume :
3
fYear :
2000
fDate :
2000
Firstpage :
189
Abstract :
The size of the time intervals between events conveys information essential for numerous sequential tasks such as motor control and rhythm detection. While hidden Markov models tend to ignore this information, recurrent neural networks (RNNs) can in principle learn to make use of it. We focus on long short-term memory (LSTM) because it usually outperforms other RNNs. Surprisingly, LSTM augmented by "peephole connections" from its internal cells to its multiplicative gates can learn the fine distinction between sequences of spikes separated by either 50 or 49 discrete time steps, without the help of any short training exemplars. Without external resets or teacher forcing, and without loss of performance on tasks reported earlier, our LSTM variant also learns to generate very stable sequences of highly nonlinear, precisely timed spikes. This makes LSTM a promising approach for real-world tasks that require timing and counting.
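The peephole mechanism described above lets each gate read the cell's internal state directly, which is what allows the network to time intervals. The following is a minimal NumPy sketch of one step of such a cell, based on the standard peephole-LSTM equations; the parameter names, diagonal peephole weights, and initialization are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, p):
    """One step of an LSTM cell with peephole connections: the input,
    forget, and output gates each receive the internal cell state
    through a diagonal (elementwise) peephole weight vector."""
    i = sigmoid(p["Wxi"] @ x + p["Whi"] @ h_prev + p["wci"] * c_prev + p["bi"])
    f = sigmoid(p["Wxf"] @ x + p["Whf"] @ h_prev + p["wcf"] * c_prev + p["bf"])
    g = np.tanh(p["Wxc"] @ x + p["Whc"] @ h_prev + p["bc"])
    c = f * c_prev + i * g          # updated cell state
    # The output-gate peephole reads the *updated* cell state c.
    o = sigmoid(p["Wxo"] @ x + p["Who"] @ h_prev + p["wco"] * c + p["bo"])
    h = o * np.tanh(c)
    return h, c

def init_params(n_in, n_hid, rng):
    """Random small-weight initialization (illustrative only)."""
    p = {}
    for g in ("i", "f", "c", "o"):
        p[f"Wx{g}"] = rng.standard_normal((n_hid, n_in)) * 0.1
        p[f"Wh{g}"] = rng.standard_normal((n_hid, n_hid)) * 0.1
        p[f"b{g}"] = np.zeros(n_hid)
        if g != "c":  # peepholes exist only for the three gates
            p[f"wc{g}"] = rng.standard_normal(n_hid) * 0.1
    return p
```

Because the gates see `c` directly, a slowly integrating cell state can act as a counter whose threshold crossings the gates detect, which is the intuition behind distinguishing 49- from 50-step gaps.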
Keywords :
counting circuits; learning (artificial intelligence); recurrent neural nets; sequences; timing; LSTM; RNN; counting; discrete time steps; highly-nonlinear precisely-timed spike sequences; internal cells; long short-term memory; motor control; multiplicative gates; peephole connections; recurrent neural networks; rhythm detection; sequential tasks; stable sequences; time intervals; timing; Delay; Event detection; Hidden Markov models; Humans; Motor drives; Pattern recognition; Performance loss; Recurrent neural networks; Rhythm; World Wide Web;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Neural Networks, 2000. IJCNN 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.861302
Filename :
861302