DocumentCode :
2754814
Title :
Asynchronous learning dynamics in massively parallel recurrent neural networks
Author :
Wu, Chwan-Hwa ; Tsai, Jyun-Hwei
Author_Institution :
Dept. of Electr. Eng., Auburn Univ., AL, USA
fYear :
1991
fDate :
8-14 Jul 1991
Abstract :
Summary form only given. A mathematical basis for the concurrent asynchronous relaxation method for the parallel learning of recurrent neural networks is proposed. The condition under which this asynchronous relaxation learning method converges on multiprocessor systems is derived from partially asynchronous gradient descent optimization theory. Parallel learning of recurrent neural networks was successfully implemented on a CRAY X-MP using Macrotasking and on an iPSC/2 using asynchronous communication. The recurrent neural network is trained to learn the behavior of a class of aperiodic, chaotic nonlinear differential-delay equations of Mackey and Glass.
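(For context, the Mackey-Glass system named above is the scalar differential-delay equation dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t). The abstract does not state the parameter values or integration scheme the authors used; the sketch below is a minimal Euler-method simulation in Python with the commonly cited chaotic setting (beta=0.2, gamma=0.1, n=10, tau=17), offered only to illustrate the kind of trajectory such a network is trained to reproduce, not the authors' implementation.

import numpy as np

def mackey_glass(beta=0.2, gamma=0.1, n=10, tau=17.0, dt=0.1, steps=5000):
    # Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)**n) - gamma*x(t)
    delay = int(round(tau / dt))        # number of steps spanned by the delay
    x = np.full(delay + steps, 1.2)     # assumed constant history x(t) = 1.2 for t <= 0
    for t in range(delay, delay + steps - 1):
        x_tau = x[t - delay]            # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau**n) - gamma * x[t])
    return x[delay:]                    # discard the synthetic history segment

series = mackey_glass()  # tau = 17 yields the classic aperiodic (chaotic) regime)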
Keywords :
dynamics; learning systems; neural nets; optimisation; parallel processing; CRAY X-MP; Glass; Mackey; Macrotasking; asynchronous communication; asynchronous learning dynamics; asynchronous relaxation learning; chaotic nonlinear differential-delay equations; concurrent asynchronous relaxation; iPSC/2; massively parallel recurrent neural networks; parallel learning; partially asynchronous gradient descent optimization; Asynchronous communication; Chaotic communication; Differential equations; Glass; Learning systems; Multiprocessing systems; Nonlinear equations; Optimization methods; Recurrent neural networks; Relaxation methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155645
Filename :
155645