Title :
New Developments on Recurrent Neural Networks Training
Author :
Pattamavorakun, S. ; Pattamavorakun, S.
Author_Institution :
Rajamangala Univ. of Technol., Pathumthani
Abstract :
A new algorithm is proposed for improving the convergence of recurrent neural network training. The algorithm combines the weight-update method of the Atiya-Parlos algorithm (which finds the direction of weight change by approximation) with the Y-N algorithm technique (which estimates fictitious target signals for the hidden nodes so that hidden weights are updated separately from the output weights), and then adds an error self-recurrent (ESR) network to improve the error function: the errors computed at the output units are fed back to determine the weight updates of the output-unit nodes, which speeds up convergence and reduces sensitivity to the initial weights. The results showed that both fully and partially recurrent neural networks trained with the proposed algorithm could forecast the selected daily flow data quite satisfactorily.
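The error-feedback idea described above can be sketched minimally: the previous output error is fed back as an extra input to the output unit, and the current output error drives the weight change. This is an illustrative sketch only; the function name, the delta-rule update, and the linear output unit are assumptions, not the paper's exact ESR formulation.

```python
def esr_update(weights, inputs, target, prev_error, lr=0.1):
    """One illustrative error-self-recurrent style step: the previous
    output error is appended to the inputs, and the resulting output
    error determines the weight update of the output unit."""
    # Augment the input vector with the fed-back previous error.
    x = inputs + [prev_error]
    # Linear output unit (a squashing function is omitted for brevity).
    y = sum(w * xi for w, xi in zip(weights, x))
    error = target - y
    # Delta-rule update driven by the output error (an assumed form).
    new_weights = [w + lr * error * xi for w, xi in zip(weights, x)]
    return new_weights, y, error
```

Iterating this update on a fixed input/target pair shrinks the output error step by step, which is the behavior the abstract attributes to feeding output errors back into the weight updates.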
Keywords :
recurrent neural nets; error functions; error self-recurrent network; recurrent neural networks training; Approximation algorithms; Backpropagation algorithms; Error correction; Gradient methods; Learning; Management training; Neural networks; Neurofeedback; Neurons; Recurrent neural networks;
Conference_Titel :
5th ACIS International Conference on Software Engineering Research, Management & Applications (SERA 2007)
Conference_Location :
Busan
Print_ISBN :
0-7695-2867-8
DOI :
10.1109/SERA.2007.102