DocumentCode :
275935
Title :
Algebraic learning in syntactic neural networks
Author :
Lucas, S.
Author_Institution :
Southampton Univ., UK
fYear :
1991
fDate :
18-20 Nov 1991
Firstpage :
185
Lastpage :
189
Abstract :
The paper presents a new class of learning algorithms for syntactic neural networks (SNNs), in which each connection weight in the net is represented by a variable rather than a number. Processing a training pattern therefore yields an algebraic expression for the output; this expression is then equated to the desired net output. In this manner the author derives a set of simultaneous equations, one equation for each pattern in the training set. Learning is then a process of constraint satisfaction, i.e. of choosing values for the variables such that the equations are solved exactly, or such that some measure of error is minimised. The author first gives a brief account of syntactic neural networks, then explains how a training string is mapped into an algebraic expression and describes the constraint satisfaction process. He then demonstrates a simple language inference problem and provides some initial results. The SNN is a modular architecture composed of simple networks called local inference machines (LIMs); each LIM is responsible for inferring and parsing a simple grammar subset corresponding to a particular span of the input.
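The learning scheme the abstract describes — weights as variables, one equation per training pattern, learning as exact constraint satisfaction — can be illustrated with a minimal sketch. The toy linear "network", the example patterns, and the solver below are illustrative assumptions, not the paper's SNN architecture or code:

```python
from fractions import Fraction

# Hypothetical sketch: each connection weight is a variable; processing a
# training pattern yields a linear expression in the weights, which is
# equated to the desired output.  Learning = solving the resulting system.

# Toy "network": output = w0*x0 + w1*x1.  One (inputs, target) pair per
# training pattern (made-up data for illustration).
patterns = [((1, 0), 2), ((0, 1), 3), ((1, 1), 5)]

# Build the augmented system [coefficients | target], one row per pattern.
rows = [[Fraction(c) for c in coeffs] + [Fraction(t)]
        for coeffs, t in patterns]

def solve(rows, n):
    """Gauss-Jordan elimination over the rationals.

    Returns the n weight values if the system is exactly satisfiable,
    or None if inconsistent (where one would instead minimise error).
    """
    r = 0
    for col in range(n):
        pivot = next((i for i in range(r, len(rows)) if rows[i][col] != 0),
                     None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        rows[r] = [v / rows[r][col] for v in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                rows[i] = [a - rows[i][col] * b
                           for a, b in zip(rows[i], rows[r])]
        r += 1
    # A row 0 = nonzero means no exact solution exists.
    for row in rows:
        if all(v == 0 for v in row[:-1]) and row[-1] != 0:
            return None
    return [rows[i][-1] for i in range(n)]

weights = solve(rows, 2)
print(weights)  # [Fraction(2, 1), Fraction(3, 1)]
```

Here the three patterns produce three simultaneous equations in two weight variables; the system happens to be consistent, so an exact solution exists. An inconsistent training set would return None, corresponding to the paper's fallback of minimising some error measure instead.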
Keywords :
computational linguistics; context-free grammars; learning systems; neural nets; algebraic expression; connection weight; constraint satisfaction; grammar subset; grammatical inference; language inference problem; learning algorithms; local inference machines; simultaneous equations; syntactic neural networks; training pattern; training set; training string
fLanguage :
English
Publisher :
iet
Conference_Titel :
Second International Conference on Artificial Neural Networks, 1991
Conference_Location :
Bournemouth
Print_ISBN :
0-85296-531-1
Type :
conf
Filename :
140312