Title :
Supervised learning of regular languages by neural networks
Author_Institution :
Inst. Galilee, Univ. de Paris-Nord, Villetaneuse, France
Date :
27 Jun-2 Jul 1994
Abstract :
Introduces a new method for learning regular languages using neural networks. The method is based on two algorithms and assumes that the size of the alphabet is constant. The first algorithm constructs the initial neural network from the maximal length M of the words belonging to a positive sample and from the size of the alphabet. The second algorithm (the learning algorithm) constructs the final neural network by adjusting the weights of the initial neural network. The author proposes a method of time complexity O(n²) which, for a particular class of regular languages, improves upon the algorithm of Angluin [Angluin 1987], whose time complexity is O(mn⁴+m²n³). The author shows that constructing neural networks instead of finite automata reduces the computation time and accelerates the learning process.
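The abstract does not give the details of either algorithm; the following Python sketch is only a hypothetical illustration of the two-step scheme it outlines: an initial network whose size is fixed by the maximal word length M of the positive sample and by the alphabet size, followed by a supervised weight-adjustment pass over labelled words. The function names, the positional one-hot encoding, and the perceptron-style update are assumptions made for illustration, not the paper's actual construction.

import numpy as np

def build_initial_network(positive_sample, alphabet):
    # First step (illustrative): dimensions derived from M = maximal word
    # length in the positive sample and from the alphabet size.
    M = max(len(w) for w in positive_sample)
    input_size = M * len(alphabet)            # one slot per (position, symbol)
    rng = np.random.default_rng(0)
    weights = rng.normal(scale=0.1, size=input_size)
    bias = 0.0
    return weights, bias, M

def encode(word, alphabet, M):
    # One-hot encode a word of length <= M, padded with zeros to length M.
    x = np.zeros(M * len(alphabet))
    for i, symbol in enumerate(word):
        x[i * len(alphabet) + alphabet.index(symbol)] = 1.0
    return x

def learn(weights, bias, M, alphabet, labelled_words, epochs=50, lr=0.1):
    # Second step (illustrative): adjust the weights of the initial network
    # from accept/reject labels, here with a simple perceptron update.
    for _ in range(epochs):
        for word, label in labelled_words:    # label: 1 = in language, 0 = not
            x = encode(word, alphabet, M)
            out = 1.0 if weights @ x + bias > 0 else 0.0
            weights += lr * (label - out) * x
            bias += lr * (label - out)
    return weights, bias

# Toy usage: words over {a, b} of length at most 3 that start with 'a'.
alphabet = ['a', 'b']
positive = ['a', 'ab', 'aba']
negative = ['b', 'ba', 'bb']
w, b, M = build_initial_network(positive, alphabet)
data = [(p, 1) for p in positive] + [(n, 0) for n in negative]
w, b = learn(w, b, M, alphabet, data)
print(all(w @ encode(p, alphabet, M) + b > 0 for p in positive))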
Keywords :
formal languages; learning (artificial intelligence); neural nets; learning algorithm; neural networks; regular languages; supervised learning; time complexity; Computer languages; Computer networks; Electronic mail; Graphics; Information retrieval; Learning automata; Neural networks; Pattern recognition; Supervised learning; Testing;
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374290