Title :
On learning context-free and context-sensitive languages
Author :
Bodén, Mikael ; Wiles, Janet
Author_Institution :
Sch. of Inf. Science, Comput. & Electr. Eng., Halmstad Univ., Sweden
Date :
3/1/2002 12:00:00 AM
Abstract :
The long short-term memory (LSTM) is not the only neural network that learns a context-sensitive language. Second-order sequential cascaded networks (SCNs) can induce, from a finite fragment of a context-sensitive language, a means of processing strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.
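The "second-order" in SCN refers to multiplicative weight terms that couple each state unit with each input unit, so the effective input-to-state mapping changes with the current state. A minimal sketch of one such update step, in plain Python (the weight tensor `W` and the `scn_step` helper are illustrative, not the authors' implementation):

```python
import math

def scn_step(W, state, inp):
    """One second-order recurrent update:
    s_i' = tanh( sum_{j,k} W[i][j][k] * s_j * x_k ).
    W[i][j][k] couples state unit j and input unit k to new state unit i."""
    new_state = []
    for Wi in W:  # one slice per output state unit
        acc = 0.0
        for j, s in enumerate(state):
            for k, x in enumerate(inp):
                acc += Wi[j][k] * s * x
        new_state.append(math.tanh(acc))
    return new_state

# Toy run over the string "aabbcc" with one-hot symbol encodings,
# illustrating how the state trajectory evolves per input symbol.
symbols = {"a": [1.0, 0.0, 0.0], "b": [0.0, 1.0, 0.0], "c": [0.0, 0.0, 1.0]}
n_state, n_in = 2, 3
W = [[[0.5 * (i - j + k) for k in range(n_in)]
      for j in range(n_state)]
     for i in range(n_state)]  # arbitrary fixed weights for the demo
state = [0.1, 0.1]
for ch in "aabbcc":
    state = scn_step(W, state, symbols[ch])
```

Because the weights seen by the input are effectively re-computed from the state at every step, the network's dynamics can realize counting-like behavior needed for languages such as a^n b^n c^n.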
Keywords :
context-free languages; context-sensitive languages; learning (artificial intelligence); recurrent neural nets; cascaded networks; context sensitive language; context-free languages; long short-term memory; neural network; recurrent neural network; Australia; Context modeling; Gold; Information science; Information technology; Neural networks; Read only memory; Recurrent neural networks; State-space methods; Testing
Journal_Title :
IEEE Transactions on Neural Networks