Title :
Second-order recurrent neural networks for grammatical inference
Author :
Giles, C.L. ; Chen, D. ; Miller, C.B. ; Chen, H.H. ; Sun, G.Z. ; Lee, Y.C.
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Abstract :
It is shown that a recurrent, second-order neural network using a real-time, feedforward training algorithm readily learns to infer regular grammars from positive and negative string training samples. Numerous simulations are presented that show the effect of initial conditions, training set size and order, and neuron architecture. All simulations were performed with random initial weight strengths and usually converged after approximately one hundred epochs of training. The authors discuss a quantization algorithm for dynamically extracting finite-state automata during and after training. For a well-trained neural net, the extracted automata constitute an equivalence class of state machines that are reducible to the minimal machine of the inferred grammar. It is then shown through simulations that many of the neural net state machines are dynamically stable and correctly classify long unseen strings.
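The sketch below illustrates, in Python, the two mechanisms the abstract names: a second-order (bilinear) recurrent state update and quantization-based extraction of a finite-state automaton from the analog state trajectories. It is not the authors' implementation; the class and function names, the uniform weight range, and the quantization level q are illustrative assumptions.

```python
# Minimal sketch of a second-order RNN state update and quantization-based
# automaton extraction; names and parameter values are assumptions, not the
# paper's code.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SecondOrderRNN:
    """Second-order recurrent net: the next analog state is a sigmoid of a
    bilinear form in the current state vector and the current input symbol."""

    def __init__(self, n_states, n_symbols, seed=0):
        rng = np.random.default_rng(seed)
        # Random initial weight strengths, as in the simulations described
        # above (the uniform range here is an assumption).
        self.W = rng.uniform(-1.0, 1.0, size=(n_symbols, n_states, n_states))
        self.n_states = n_states

    def step(self, state, symbol):
        # S_j(t+1) = g( sum_i W[symbol, j, i] * S_i(t) )
        return sigmoid(self.W[symbol] @ state)

    def start_state(self):
        s = np.zeros(self.n_states)
        s[0] = 1.0  # conventional fixed start state
        return s


def extract_automaton(net, strings, q=2):
    """Quantize [0, 1]^n into q^n hypercubes and record the transitions the
    analog trajectories induce between them; the result is a state machine
    that can then be reduced with any standard DFA-minimization algorithm."""
    def bin_of(s):
        return tuple(np.minimum((s * q).astype(int), q - 1))

    transitions = {}
    for symbols in strings:
        s = net.start_state()
        prev = bin_of(s)
        for k in symbols:
            s = net.step(s, k)
            cur = bin_of(s)
            transitions[(prev, k)] = cur
            prev = cur
    return transitions


# Usage: run a few binary strings through an untrained net just to exercise
# the extraction path; a well-trained net would yield a stable automaton.
net = SecondOrderRNN(n_states=4, n_symbols=2)
dfa = extract_automaton(net, [[0, 1, 0, 1], [1, 1, 0], [0, 0, 1]])
for (state, sym), nxt in sorted(dfa.items()):
    print(f"{state} --{sym}--> {nxt}")
```

In the second-order form, each weight slice `W[k]` acts like an analog transition matrix for input symbol `k`, which gives some intuition for why quantizing the well-separated states of a trained net recovers a machine reducible to the minimal machine of the inferred grammar.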
Keywords :
finite automata; grammars; inference mechanisms; learning systems; neural nets; real-time systems; feedforward training; finite-state automata; negative string training; neuron architecture; positive string training; second-order recurrent neural nets; state machines; Feedforward neural networks; Inference algorithms; Learning automata; Recurrent neural networks
Conference_Title :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA, USA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155350