DocumentCode :
3224911
Title :
Grammatical inference using higher order recurrent neural networks
Author :
Harigopal, Umesh ; Chen, H.C.
Author_Institution :
Dept. of Comput. Sci., Alabama Univ., Tuscaloosa, AL, USA
fYear :
1993
fDate :
7-9 Mar 1993
Firstpage :
338
Lastpage :
342
Abstract :
As a step towards proving theorems on the capabilities of a connectionist cognitive model, the capabilities and limitations of a higher order recurrent network in learning to perform grammatical inference on regular languages are evaluated. It is seen that the recurrent neural network can learn to recognize an unknown regular grammar, and the finite state automaton (FSA) corresponding to the learned grammar can be extracted from the state space of the network. The trained network can also recognize strings much longer than those it was trained on. A straightforward method for incorporating partial knowledge of a deterministic FSA into the recurrent network was implemented to examine its effect on convergence time. Simulation results show a substantial improvement in the rate of convergence of the network.
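Note: the abstract does not state the network equations. The following is a minimal sketch, assuming a second-order recurrent update of the kind commonly used for grammatical inference on regular languages, in which each next-state unit is driven by products of current-state units and the one-hot input symbol. The class name SecondOrderRNN, the start-state convention, and the accept-unit readout are illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SecondOrderRNN:
    """Sketch of a second-order (higher order) recurrent network.

    Assumed update: S_j(t+1) = sigmoid(sum_{i,k} W[j, i, k] * S_i(t) * I_k(t)),
    where S is the state vector and I is the one-hot encoded input symbol.
    """

    def __init__(self, n_states, n_symbols, seed=0):
        rng = np.random.default_rng(seed)
        # W[j, i, k] couples current-state unit i and input symbol k
        # to next-state unit j (the second-order product term).
        self.W = rng.normal(scale=0.1, size=(n_states, n_states, n_symbols))
        self.n_states = n_states
        self.n_symbols = n_symbols

    def run(self, symbols):
        """Process a string given as a sequence of symbol indices; return the final state."""
        s = np.zeros(self.n_states)
        s[0] = 1.0  # assumed designated start state
        for k in symbols:
            x = np.zeros(self.n_symbols)
            x[k] = 1.0  # one-hot encode the current input symbol
            # Second-order update over all state-symbol products.
            s = sigmoid(np.einsum('jik,i,k->j', self.W, s, x))
        return s

    def accepts(self, symbols, threshold=0.5):
        """Read the first state unit as an accept/reject output (assumed convention)."""
        return self.run(symbols)[0] > threshold


# Example usage with a hypothetical two-symbol alphabet:
# rnn = SecondOrderRNN(n_states=5, n_symbols=2)
# rnn.accepts([0, 1, 0])
```

After training on labeled strings, an FSA can in principle be extracted by clustering the visited points of the state space and recording the transitions between clusters, which is the kind of extraction the abstract refers to.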
Keywords :
cognitive systems; convergence; finite automata; formal languages; grammars; inference mechanisms; recurrent neural nets; state-space methods; theorem proving; connectionist cognitive model; convergence time; finite state automata; grammatical inference; higher order recurrent neural networks; regular languages; state space; strings; Computer science; Context modeling; Convergence; Learning automata; Natural language processing; Neural networks; Neurons; Performance evaluation; Recurrent neural networks; State-space methods;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the Twenty-Fifth Southeastern Symposium on System Theory (SSST '93)
Conference_Location :
Tuscaloosa, AL
ISSN :
0094-2898
Print_ISBN :
0-8186-3560-6
Type :
conf
DOI :
10.1109/SSST.1993.522798
Filename :
522798