DocumentCode :
527498
Title :
Approximation performance analysis of recurrent neural networks
Author :
Cong, Shuang ; Yu, Ming ; Dai, Yi
Author_Institution :
Dept. of Autom., Univ. of Sci. & Technol. of China, Hefei, China
Volume :
2
fYear :
2010
fDate :
10-12 Aug. 2010
Firstpage :
1074
Lastpage :
1078
Abstract :
Based on the transformation of the state-space model of a general recurrent neural network into an input/output model, we prove that recurrent networks can, under certain conditions, approximate arbitrary nonlinear mappings to any desired accuracy. We further point out that, to realize such arbitrary nonlinear function approximation with recurrent neural networks, the initial conditions, the number of nodes in the hidden layer, and the approximation accuracy must all be taken into account. A complete network design procedure is illustrated with a numerical example that verifies the results obtained.
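As a hedged illustration of the idea summarized above (not code from the paper), the sketch below writes a small Elman-style recurrent network in its state-space form x(k+1) = tanh(W x(k) + V u(k)), y(k) = C x(k), and simulates it so the output is seen purely as a function of the input history and the initial state x(0), i.e. the input/output view. The hidden-layer size, the random weights, and the test input sequence are all arbitrary assumptions chosen only for demonstration.

```python
# Illustrative sketch only: a recurrent network in state-space form, viewed
# as an input/output map y(k) = g(u(0..k), x(0)).  All weights and sizes
# below are assumptions for demonstration, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 8                                           # hidden-layer nodes (a design choice)
W = 0.5 * rng.standard_normal((n_hidden, n_hidden))    # recurrent (state) weights
V = rng.standard_normal((n_hidden, 1))                 # input weights
C = rng.standard_normal((1, n_hidden))                 # output weights


def io_map(u_seq, x0):
    """Simulate the state-space recursion; the returned sequence depends only
    on the input history and the initial condition x0, which is exactly the
    input/output form of the same network."""
    x = x0.copy()
    ys = []
    for u in u_seq:
        x = np.tanh(W @ x + V * u)     # state update x(k+1) = tanh(W x + V u)
        ys.append((C @ x).item())      # output map   y(k)   = C x(k)
    return np.array(ys)


u_seq = np.sin(0.3 * np.arange(50))                    # arbitrary input sequence
y_from_zero  = io_map(u_seq, np.zeros((n_hidden, 1)))
y_from_other = io_map(u_seq, 0.1 * np.ones((n_hidden, 1)))

# Different initial conditions give different output trajectories, which is
# why the abstract stresses that the initial conditions and the number of
# hidden nodes must be considered in the approximation analysis.
print(np.max(np.abs(y_from_zero - y_from_other)))
```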
Keywords :
recurrent neural nets; approximation performance analysis; arbitrary nonlinear function approximation; general recurrent neural networks; hidden layer; state-space model; Artificial neural networks; Function approximation; Mathematical model; Recurrent neural networks; Testing; Training; function approximation; input/output model; recurrent neural networks; state-space model;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Natural Computation (ICNC), 2010 Sixth International Conference on
Conference_Location :
Yantai, Shandong
Print_ISBN :
978-1-4244-5958-2
Type :
conf
DOI :
10.1109/ICNC.2010.5582999
Filename :
5582999