DocumentCode :
1263972
Title :
On learning context-free and context-sensitive languages
Author :
Bodén, Mikael ; Wiles, Janet
Author_Institution :
Sch. of Inf. Science, Comput. & Electr. Eng., Halmstad Univ., Sweden
Volume :
13
Issue :
2
fYear :
2002
fDate :
3/1/2002 12:00:00 AM
Firstpage :
491
Lastpage :
493
Abstract :
The long short-term memory (LSTM) is not the only neural network that learns a context-sensitive language. Second-order sequential cascaded networks (SCNs) can induce, from a finite fragment of a context-sensitive language, a means of processing strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.
Keywords :
context-free languages; context-sensitive languages; learning (artificial intelligence); recurrent neural nets; cascaded networks; context-sensitive language; long short-term memory; neural network; recurrent neural network; Australia; Context modeling; Gold; Information science; Information technology; Neural networks; Read only memory; Recurrent neural networks; State-space methods; Testing
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.991436
Filename :
991436