Title :
A learning algorithm for improved recurrent neural networks
Author :
Chen, C.H. ; Yu, Liwen
Author_Institution :
Dept. of Electrical & Computer Engineering, University of Massachusetts Dartmouth, North Dartmouth, MA, USA
Abstract :
An improved recurrent neural network structure is proposed. The exact form of a gradient-following learning algorithm for continuously running neural networks is derived for temporal supervised learning tasks. The algorithm allows networks to learn complex tasks that require the retention of information over time, and it compensates for information that traditional recurrent neural networks miss. Empirical results show that networks trained with this algorithm achieve better prediction performance than a backpropagation-trained network and the Elman recurrent neural network.
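Illustrative sketch (not from the paper) :
The abstract gives only a high-level description of the derived learning rule. The sketch below shows one standard gradient-following algorithm for a continuously running recurrent network, real-time recurrent learning in the style of Williams and Zipser, which is assumed here to be close in spirit to the derived algorithm; it is a minimal illustration, not the authors' exact formulation. The network sizes, learning rate, logistic activation, and the toy sine-prediction task are all assumed for the example.

# Minimal RTRL-style sketch of a gradient-following rule for a continuously
# running recurrent network (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_units = 1, 4            # assumed sizes: 1 external input, 4 recurrent units
n_z = n_in + n_units + 1        # concatenated signal z = [input, unit outputs, bias]
alpha = 0.05                    # assumed learning rate

W = rng.normal(scale=0.1, size=(n_units, n_z))   # every connection is trained
y = np.zeros(n_units)                            # unit outputs
p = np.zeros((n_units, n_units, n_z))            # sensitivities d y_k / d w_ij

def f(s):                        # logistic activation
    return 1.0 / (1.0 + np.exp(-s))

# Toy temporal task (assumed): predict the next value of a scaled sine wave.
T = 2000
x_seq = 0.5 + 0.4 * np.sin(0.2 * np.arange(T + 1))

for t in range(T):
    z = np.concatenate(([x_seq[t]], y, [1.0]))   # external input, state, bias
    s = W @ z
    y_new = f(s)
    fprime = y_new * (1.0 - y_new)

    # Sensitivity recursion: p_k,ij(t+1) = f'(s_k) [ sum_l W_rec[k,l] p_l,ij(t) + delta_ki z_j ]
    W_rec = W[:, n_in:n_in + n_units]            # recurrent part of the weight matrix
    p_new = np.einsum('kl,lij->kij', W_rec, p)
    p_new[np.arange(n_units), np.arange(n_units), :] += z
    p_new *= fprime[:, None, None]

    # Error on unit 0, read out as the prediction of x(t+1)
    e = np.zeros(n_units)
    e[0] = x_seq[t + 1] - y_new[0]

    # Gradient-following update: dW_ij = alpha * sum_k e_k * p_k,ij
    W += alpha * np.einsum('k,kij->ij', e, p_new)

    y, p = y_new, p_new

print("final squared prediction error:", (x_seq[T] - y[0]) ** 2)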
Keywords :
forecasting theory; learning (artificial intelligence); recurrent neural nets; time series; Elman recurrent neural network; backpropagation trained network; continuously running neural networks; gradient-following learning algorithm; prediction performance; temporal supervised learning tasks; Backpropagation algorithms; Computer architecture; Computer networks; Erbium; Joining processes; Neural networks; Neurons; Prediction algorithms; Recurrent neural networks; Supervised learning;
Conference_Title :
International Conference on Neural Networks, 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.614249