Title :
Parallel implementation of partially connected recurrent networks
Author :
Young, Fung-Yu ; Chan, Lai-Wan
Author_Institution :
Dept. of Comput. Sci., Chinese Univ. of Hong Kong, Shatin, Hong Kong
Date :
27 Jun-2 Jul 1994
Abstract :
Recurrent neural networks are well suited to problems with temporal extent, e.g. speech recognition, time series prediction, and sequence generation. Their biggest drawback, however, is the computational complexity of the training process. With the well-known real-time recurrent learning rule of Williams and Zipser (1989), the training time of each epoch is of order O(n^4), where n is the total number of hidden and output units. To cope with this low efficiency, the authors have devised a new network model called the partially connected recurrent network. With the modified learning rule, the computational complexity of training is reduced to O(mn+np), where m is the number of output units and p is the number of external inputs. In addition, this new recurrent model is well suited to a regularly connected SIMD parallel computer, since each neuron needs to communicate only with the neurons directly connected to it. The authors have verified this idea by implementing both a ring-structured recurrent network and a grid-structured recurrent network on the DECmpp 12000Sx System with 8192 processors. The training time per epoch then grows only linearly, i.e. O(m+n+p), which is a significant advantage for solving large-scale problems.
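To illustrate the locality that the abstract relies on, the sketch below shows a ring-structured, partially connected recurrent layer in which each hidden unit receives recurrent input only from its two ring neighbours and itself, plus the p external inputs. This is a minimal illustration under assumed connectivity, not the authors' exact model or learning rule; all function and variable names (ring_step, W_in, w_left, w_right, w_self) are hypothetical. On a SIMD machine such as the DECmpp, each unit would map to one processor and the neighbour shifts (here np.roll) would correspond to nearest-neighbour communication, so one update costs O(np + n) rather than the O(n^2) of a fully connected recurrent layer.

```python
# Minimal sketch (assumed connectivity, not the paper's exact formulation) of a
# ring-structured, partially connected recurrent layer: each of the n hidden
# units sees only its two ring neighbours, itself, and the p external inputs.
import numpy as np

def ring_step(h, x, W_in, w_left, w_right, w_self):
    """One forward update of a ring-connected recurrent layer.

    h        : (n,)   current hidden state
    x        : (p,)   external input at this time step
    W_in     : (n, p) input-to-hidden weights
    w_left   : (n,)   weight from each unit's left ring neighbour
    w_right  : (n,)   weight from each unit's right ring neighbour
    w_self   : (n,)   self-recurrent weight
    """
    left = np.roll(h, 1)     # state of the neighbour on one side of the ring
    right = np.roll(h, -1)   # state of the neighbour on the other side
    pre = W_in @ x + w_left * left + w_right * right + w_self * h
    return np.tanh(pre)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, T = 8, 3, 5                          # hidden units, inputs, time steps
    h = np.zeros(n)
    W_in = rng.normal(scale=0.1, size=(n, p))
    w_left = rng.normal(scale=0.1, size=n)
    w_right = rng.normal(scale=0.1, size=n)
    w_self = rng.normal(scale=0.1, size=n)
    for _ in range(T):
        x = rng.normal(size=p)
        h = ring_step(h, x, W_in, w_left, w_right, w_self)
    print("final hidden state:", h)
```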
Keywords :
computational complexity; learning (artificial intelligence); parallel architectures; recurrent neural nets; DECmpp 12000Sx System; computational complexity; grid-structured recurrent network; hidden units; large scale problems; output units; parallel implementation; partially connected recurrent networks; regularly connected SIMD parallel computer; ring-structured recurrent network; training process; Computational complexity; Computer networks; Computer science; Concurrent computing; Large-scale systems; Natural languages; Neurons; Recurrent neural networks; Shape; Speech recognition;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374530