DocumentCode :
2293987
Title :
Stabilizing and Improving the Learning Speed of 2-Layered LSTM Network
Author :
Correa, Debora C. ; Levada, Alexandre L.M. ; Saito, J.H.
Author_Institution :
Dept. de Computação, Univ. Fed. de São Carlos, São Carlos
fYear :
2008
fDate :
16-18 July 2008
Firstpage :
293
Lastpage :
300
Abstract :
This paper presents a novel method to initialize the LSTM network weights in order to improve and stabilize the learning speed, based on Nguyen and Widrow's work for MLP networks. The derived equations for weight initialization are based on a study of the behavior of the memory cells' output in the hidden layer. To test and evaluate the proposed method, we use a 2-layered LSTM network to approximate one- and two-dimensional real non-linear functions. The obtained results show that our initialization method improves the training process.
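For context, the classic Nguyen-Widrow rule that the authors adapt can be sketched as below. This is the original MLP-layer rule only, not the paper's derived LSTM-specific equations (those require the full text); the function name and the use of NumPy are illustrative assumptions.

    import numpy as np

    def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
        """Classic Nguyen-Widrow initialization for one fully connected layer.

        Each hidden unit's weight vector is rescaled so its norm equals
        beta = 0.7 * n_hidden**(1 / n_inputs), spreading the units' active
        regions across the input space; biases are drawn from [-beta, beta].
        """
        rng = np.random.default_rng() if rng is None else rng
        beta = 0.7 * n_hidden ** (1.0 / n_inputs)
        # Start from small uniform random weights, then rescale per unit.
        w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
        norms = np.linalg.norm(w, axis=1, keepdims=True)
        w = beta * w / norms
        b = rng.uniform(-beta, beta, size=n_hidden)
        return w, b

    # Example: initialize a layer with 2 inputs and 10 hidden units.
    weights, biases = nguyen_widrow_init(n_inputs=2, n_hidden=10)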
Keywords :
learning (artificial intelligence); multilayer perceptrons; stability; 2-layered LSTM network; learning speed stabilization; long short-term memory network; memory cell; multilayer perceptron network; Computer networks; Degradation; Error correction; Logistics; Neural networks; Nonlinear equations; Recurrent neural networks; Testing; LSTM; learning; neural networks; weight initialization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computational Science and Engineering, 2008. CSE '08. 11th IEEE International Conference on
Conference_Location :
São Paulo
Print_ISBN :
978-0-7695-3193-9
Type :
conf
DOI :
10.1109/CSE.2008.32
Filename :
4578245