Title :
Remembering the past: the role of embedded memory in recurrent neural network architectures
Author :
C.L. Giles; Tsungnan Lin; B.G. Horne
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Abstract :
There has been much interest in learning long-term temporal dependencies with neural networks. Adequately learning such long-term information can be useful in many problems in signal processing, control, and prediction. One class of recurrent neural networks (RNNs), NARX neural networks, was shown to perform much better than other recurrent networks when learning simple long-term dependency problems. The intuitive explanation is that the output memories of a NARX network manifest as jump-ahead connections in the time-unfolded network. Here we show that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. Experiments with locally recurrent networks and NARX (output feedback) networks show that all of these classes of architectures achieve a significant improvement in learning long-term dependencies as the order of the embedded memory is increased, all other things held constant. These results are important for a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order of that architecture will make it more robust to the problem of learning long-term dependencies.
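Example :
A minimal sketch of the mechanism the abstract describes: a NARX-style network whose embedded output memory of order D feeds the last D outputs back as inputs. The class name NARXCell, the parameter memory_order, and all weight shapes are illustrative assumptions, not the authors' implementation.

import numpy as np

class NARXCell:
    """NARX-style recurrent cell with an embedded output memory of order D."""
    def __init__(self, n_in, n_hidden, memory_order, seed=0):
        rng = np.random.default_rng(seed)
        # The hidden layer reads the current input plus D delayed outputs.
        self.W = rng.normal(0.0, 0.1, (n_hidden, n_in + memory_order))
        self.b = np.zeros(n_hidden)
        self.w_out = rng.normal(0.0, 0.1, n_hidden)
        self.memory_order = memory_order

    def run(self, inputs):
        # inputs: (T, n_in) array; returns the (T,) output sequence.
        taps = np.zeros(self.memory_order)   # y_{t-1}, ..., y_{t-D}
        out = []
        for x_t in inputs:
            h = np.tanh(self.W @ np.concatenate([x_t, taps]) + self.b)
            y_t = np.tanh(self.w_out @ h)
            taps = np.concatenate([[y_t], taps[:-1]])  # shift the delay line
            out.append(y_t)
        return np.array(out)

# In the time-unfolded network each tap y_{t-d} acts as a jump-ahead
# connection skipping d-1 steps, which is the abstract's intuition for why
# a higher embedded memory order eases long-term dependency learning.
net = NARXCell(n_in=1, n_hidden=8, memory_order=4)
print(net.run(np.random.default_rng(1).normal(size=(20, 1))).shape)  # (20,)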
Keywords :
"Intelligent networks","Recurrent neural networks","Neural networks","Computer architecture","Backpropagation algorithms","National electric code","Educational institutions","Signal processing","Process control","Output feedback"
Conference_Title :
Neural Networks for Signal Processing VII: Proceedings of the 1997 IEEE Workshop
Print_ISBN :
0-7803-4256-9
DOI :
10.1109/NNSP.1997.622381