DocumentCode :
3417135
Title :
Inserting rules into recurrent neural networks
Author :
Giles, C.L. ; Omlin, C.W.
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
fYear :
1992
fDate :
31 Aug-2 Sep 1992
Firstpage :
13
Lastpage :
22
Abstract :
The authors present a method for incorporating a priori knowledge into the training of recurrent neural networks. This a priori knowledge can be interpreted as hints about the problem to be learned; the hints are encoded as rules which are then inserted into the neural network. The authors demonstrate the approach by training recurrent neural networks with inserted rules to recognize regular languages from grammatical string examples. Because the recurrent networks have second-order connections, rule insertion is a straightforward mapping of rules into weights and neurons. Simulations show that training recurrent networks with different amounts of partial knowledge to recognize simple grammars improves training times by orders of magnitude, even when only a small fraction of all transitions are inserted as rules. In addition, there appears to be no loss in generalization performance.
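The mapping the abstract describes (known DFA transitions programmed directly into the weights of a second-order network) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the class and method names (SecondOrderRNN, insert_rule) and the bias magnitude H are assumptions. A known transition delta(q_i, a_k) = q_j is inserted by setting the second-order weight W[j, i, k] to +H and the weights into competing state neurons to -H, so that when state neuron i is active and symbol k is read, state neuron j dominates after the update.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SecondOrderRNN:
    def __init__(self, n_states, n_symbols, rng=None):
        rng = rng or np.random.default_rng(0)
        # W[j, i, k]: second-order weight from (state neuron i, input symbol k)
        # to state neuron j; small random values before any rules are inserted.
        self.W = rng.uniform(-0.1, 0.1, size=(n_states, n_states, n_symbols))
        self.b = np.zeros(n_states)

    def insert_rule(self, i, k, j, H=4.0):
        # Encode the partial DFA transition delta(q_i, a_k) = q_j.
        self.W[:, i, k] = -H   # suppress all successor state neurons...
        self.W[j, i, k] = +H   # ...except the one the rule names

    def step(self, s, k):
        # Second-order update: s_j(t+1) = g(sum_i W[j, i, k] * s_i(t) + b_j),
        # where the one-hot input symbol k selects a slice of W.
        return sigmoid(self.W[:, :, k] @ s + self.b)

# Example: insert the rule "from state 0, on symbol 1, go to state 2".
net = SecondOrderRNN(n_states=3, n_symbols=2)
net.insert_rule(i=0, k=1, j=2)
s = np.array([1.0, 0.0, 0.0])   # state neuron 0 initially active
print(net.step(s, k=1))         # state neuron 2 dominates after one step

Because the inserted weights are large but finite, training can still adjust them; the rules act as a starting bias rather than a hard constraint, which is consistent with the abstract's finding that partial rule insertion speeds up training without hurting generalization.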
Keywords :
deterministic automata; finite automata; grammars; learning (artificial intelligence); recurrent neural nets; grammatical string examples; inserted rules; regular languages; second-order connections; training; neural networks; neurons;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Neural Networks for Signal Processing II: Proceedings of the 1992 IEEE-SP Workshop
Conference_Location :
Helsingør, Denmark
Print_ISBN :
0-7803-0557-4
Type :
conf
DOI :
10.1109/NNSP.1992.253712
Filename :
253712