Title :
Grammatical inference using higher order recurrent neural networks
Author :
Harigopal, Umesh; Chen, H.C.
Author_Institution :
Dept. of Comput. Sci., Alabama Univ., Tuscaloosa, AL, USA
Abstract :
As a step towards proving theorems on the capabilities of a connectionist cognitive model, a higher order recurrent network's capabilities and limitations in learning to perform grammatical inference on regular languages are evaluated. It is shown that the recurrent neural network can learn to recognize an unknown regular grammar, and that the finite state automaton (FSA) corresponding to the learned grammar can be extracted from the state space of the network. The proposed network can also recognize strings much longer than those it was trained on. A straightforward method for incorporating partial knowledge of a deterministic FSA into the recurrent network was implemented to examine its effect on convergence time. Simulation results show a substantial improvement in the network's rate of convergence.
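Note: The following is a minimal sketch of the forward dynamics of a second-order ("higher order") recurrent network of the kind the abstract describes. The third-order weight tensor combining the state vector with a one-hot input symbol, the sigmoid activation, the start-state convention, and the use of a designated accept neuron are assumptions based on the standard formulation of such networks, not details taken from this paper; the class name SecondOrderRNN and the NumPy implementation are illustrative only.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class SecondOrderRNN:
        def __init__(self, n_states, n_symbols, seed=0):
            rng = np.random.default_rng(seed)
            # W[j, i, k]: contribution of state neuron i together with input
            # symbol k to the next activation of state neuron j.
            self.W = rng.normal(scale=0.1, size=(n_states, n_states, n_symbols))
            self.b = np.zeros(n_states)
            self.n_states = n_states
            self.n_symbols = n_symbols

        def run(self, symbols):
            """Process a string (sequence of symbol indices) and return the
            activation of the designated accept neuron (index 0) at the end."""
            s = np.zeros(self.n_states)
            s[0] = 1.0                      # start-state convention (assumption)
            for k in symbols:
                x = np.zeros(self.n_symbols)
                x[k] = 1.0                  # one-hot encoding of the input symbol
                # Second-order update: s_j <- g( sum_{i,k} W[j,i,k] s_i x_k + b_j )
                s = sigmoid(np.einsum('jik,i,k->j', self.W, s, x) + self.b)
            return s[0]                     # > 0.5 read as "accept"

    # Usage: score a few strings over the alphabet {0, 1}
    net = SecondOrderRNN(n_states=4, n_symbols=2)
    for string in ([0, 1, 1], [1, 0], []):
        print(string, net.run(string))

In the literature on this approach, the FSA is usually extracted after training by quantizing the state-neuron activations into discrete clusters and recording the transitions between clusters; the paper's exact extraction and partial-knowledge insertion procedures are not restated here.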
Keywords :
cognitive systems; convergence; finite automata; formal languages; grammars; inference mechanisms; recurrent neural nets; state-space methods; theorem proving; connectionist cognitive model; convergence time; finite state automata; grammatical inference; higher order recurrent neural networks; regular languages; state space; strings; Computer science; Context modeling; Convergence; Learning automata; Natural language processing; Neural networks; Neurons; Performance evaluation; Recurrent neural networks; State-space methods;
Conference_Title :
Proceedings of the Twenty-Fifth Southeastern Symposium on System Theory (SSST '93)
Conference_Location :
Tuscaloosa, AL
Print_ISBN :
0-8186-3560-6
DOI :
10.1109/SSST.1993.522798