DocumentCode :
1727520
Title :
Towards the open ended evolution of neural networks
Author :
Lucas, S.M.
Author_Institution :
Essex Univ., Colchester, UK
fYear :
1995
Firstpage :
388
Lastpage :
393
Abstract :
A framework is described that allows the completely open-ended evolution of neural network architectures, based on an active weight neural network model. In this approach, there is no separate learning algorithm; learning proceeds (if at all) as an intrinsic part of the network behaviour. This has interesting applications in the evolution of neural networks, since it is now possible to evolve all aspects of a network (including the learning "algorithm") within a single unified paradigm. As an example, a grammar is given for growing a multilayer perceptron with active weights that has the error back-propagation learning algorithm embedded in its structure.
Keywords :
genetic algorithms; neural nets; active weight; error back-propagation; learning; multilayer perceptron; neural networks; open ended evolution;
fLanguage :
English
Publisher :
IET
Conference_Titel :
Genetic Algorithms in Engineering Systems: Innovations and Applications, 1995. GALESIA. First International Conference on (Conf. Publ. No. 414)
Conference_Location :
Sheffield
Print_ISBN :
0-85296-650-4
Type :
conf
DOI :
10.1049/cp:19951080
Filename :
501703