DocumentCode :
2700720
Title :
Learning sequential structure with recurrent pRAM nets
Author :
Gorse, D. ; Taylor, J.G.
Author_Institution :
Dept. of Comput. Sci., Univ. Coll., London, UK
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
37
Abstract :
Networks of probabilistic RAMs (pRAMs) may be trained using both gradient-descent and reinforcement training rules. The two approaches are applied to the problem of learning a simple grammar from exposure to a finite set of grammatically correct strings, and it is shown that the combination of nonlinearity and stochasticity in the pRAM output functions enables a recurrent network to learn the grammar quickly and accurately. Both gradient descent and reinforcement training can be used to train a pRAM net to recognize a simple regular grammar (dual parity) at a speed which greatly exceeds that achievable with networks of more conventional processors. In particular, it is demonstrated that exploiting the stochastic features of the pRAM in reinforcement training would lead to a very significant reduction in training time if the system were implemented in hardware.
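To make the abstract's central object concrete: a pRAM is a lookup-table neuron whose 2^n memory locations each store a firing probability, addressed by the binary input vector, and whose output is a stochastic (Bernoulli) sample of that probability. The sketch below is a minimal illustration, not the authors' implementation; the `reinforce` update is a hypothetical simplification of a reward-modulated rule (nudge the accessed probability toward the emitted action on reward, away from it on penalty), shown here learning single-bit parity (XOR), the building block of the dual-parity task.

```python
import random


class PRAM:
    """A probabilistic RAM (pRAM) unit: 2^n memory locations, each
    storing a firing probability, addressed by the binary input vector."""

    def __init__(self, n_inputs, rng=None):
        self.rng = rng or random.Random(0)
        # Initialise all stored probabilities at 0.5 (maximally stochastic).
        self.alpha = [0.5] * (2 ** n_inputs)

    def address(self, inputs):
        # Binary input vector -> memory address.
        a = 0
        for bit in inputs:
            a = (a << 1) | bit
        return a

    def fire(self, inputs):
        # Stochastic output: fire (1) with the stored probability.
        u = self.address(inputs)
        self.last_u = u
        self.last_a = int(self.rng.random() < self.alpha[u])
        return self.last_a

    def reinforce(self, reward, rate=0.1):
        # Illustrative reward-modulated update (hypothetical simplification
        # of the pRAM reinforcement rule): only the location that was
        # actually addressed on the last firing is adapted.
        u, a = self.last_u, self.last_a
        target = a if reward > 0 else 1 - a
        self.alpha[u] += rate * (target - self.alpha[u])
        self.alpha[u] = min(1.0, max(0.0, self.alpha[u]))


# Train a single 2-input pRAM to compute parity (XOR) by reinforcement.
rng = random.Random(42)
pram = PRAM(2, rng=rng)
for _ in range(2000):
    x, y = rng.randint(0, 1), rng.randint(0, 1)
    out = pram.fire([x, y])
    pram.reinforce(1 if out == (x ^ y) else -1)
```

Because only the addressed location is updated, learning is local to each input pattern, which is part of why such units map naturally onto hardware, as the abstract notes.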
Keywords :
learning systems; neural nets; random-access storage; dual parity; gradient descent; grammar; grammatically correct strings; nonlinearity; probabilistic RAMs; recurrent pRAM nets; reinforcement training rules; sequential structure; stochasticity; Computer science; Educational institutions; Frequency; Hardware; Machine learning; Mathematics; Phase change random access memory; Random access memory; Read-write memory; Stochastic processes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155309
Filename :
155309