DocumentCode
424024
Title
Generational versus steady-state evolution for optimizing neural network learning
Author
Bullinaria, J.A.
Author_Institution
School of Computer Science, The University of Birmingham
Volume
3
fYear
2004
fDate
25-29 July 2004
Firstpage
2297
Abstract
The use of simulated evolution is now a commonplace technique for optimizing the learning abilities of neural network systems. Neural network details such as architecture, initial weight distributions, gradient descent learning rates, and regularization parameters have all been successfully evolved to yield improved performance. The author investigates which evolutionary approaches work best in this field. In particular, he compares the traditional generational approach with a more biologically realistic steady-state approach.
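The two evolutionary schemes contrasted in the abstract can be sketched in a few lines of Python. This is a toy illustration, not the paper's implementation: the quadratic "fitness" standing in for trained-network performance, the population size, the tournament size, and the Gaussian mutation scheme are all assumptions made purely for demonstration. The generational scheme replaces the whole population each generation, while the steady-state scheme inserts one offspring at a time in place of the worst individual.

```python
import random

def fitness(lr):
    # Toy stand-in for "train a network with learning rate lr and
    # measure its performance": a quadratic with its optimum at lr = 0.1.
    # (Assumption for illustration only, not the paper's cost function.)
    return -(lr - 0.1) ** 2

def mutate(lr, sigma=0.02):
    # Gaussian perturbation, clipped to keep the rate positive.
    return max(1e-4, lr + random.gauss(0.0, sigma))

def generational_evolve(pop, generations=50):
    # Generational GA: every generation, the fitter half of the
    # population breeds a complete replacement population.
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[: len(pop) // 2]
        pop = [mutate(random.choice(parents)) for _ in range(len(pop))]
    return max(pop, key=fitness)

def steady_state_evolve(pop, births=500):
    # Steady-state GA: one offspring at a time replaces the worst
    # individual, so the population turns over gradually.
    for _ in range(births):
        parent = max(random.sample(pop, 3), key=fitness)  # tournament
        child = mutate(parent)
        worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child
    return max(pop, key=fitness)

random.seed(0)
init = [random.uniform(0.0, 1.0) for _ in range(20)]
best_gen = generational_evolve(list(init))
best_ss = steady_state_evolve(list(init))
print(f"generational best lr = {best_gen:.3f}")
print(f"steady-state best lr = {best_ss:.3f}")
```

On this toy problem both schemes converge to roughly the same learning rate; the paper's question is which scheme does so more effectively when the fitness evaluation is a full neural network training run.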
Keywords
gradient methods; learning (artificial intelligence); neural net architecture; optimisation; biologically realistic steady state method; generational approach; gradient descent learning rates; initial weight distributions; neural network learning; neural network systems; optimization; steady state evolution; Computational modeling; Computer architecture; Computer science; Cost function; Electronic mail; Equations; Evolution (biology); Feedforward systems; Neural networks; Steady-state
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
ISSN
1098-7576
Print_ISBN
0-7803-8359-1
Type
conf
DOI
10.1109/IJCNN.2004.1380984
Filename
1380984