Title :
Improving generalization by using genetic algorithms to determine the neural network size
Author :
Bebis, George ; Georgiopoulos, Michael
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Central Florida, Orlando, FL, USA
Abstract :
Recent theoretical results support the claim that reducing the number of free parameters in a neural network (i.e., its weights) can improve generalization. The importance of these results has triggered the development of many approaches that try to determine an “appropriate” network size for a given problem. Although most of these approaches have been shown to find small networks that solve the problem at hand, it is quite remarkable that the generalization capabilities of the resulting networks have not been explored thoroughly. In this paper, we propose coupling genetic algorithms with weight pruning, with the objective of both reducing network size and improving generalization. The innovation of our approach lies in a fitness function that uses an adaptive parameter to encourage the reproduction of networks having good generalization performance and relatively small size.
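The abstract's core idea can be illustrated with a minimal sketch. This is not the authors' exact formulation; the function names (`fitness`, `adapt_lambda`) and the specific form of the penalty and adaptation rule are assumptions made for illustration. The sketch rewards generalization (validation accuracy) while penalizing network size, with an adaptive parameter that increases size pressure only while generalization remains acceptable.

```python
# Illustrative sketch (assumed form, not the paper's exact fitness function):
# a GA fitness that combines validation accuracy with an adaptive size penalty.

def fitness(val_accuracy, n_weights, max_weights, lam):
    """Higher is better: validation accuracy minus a size penalty scaled by lam."""
    size_ratio = n_weights / max_weights  # fraction of available weights in use
    return val_accuracy - lam * size_ratio

def adapt_lambda(lam, best_accuracy, target=0.9, step=0.05, lam_max=0.5):
    """Adapt the penalty weight: press harder on size only while the best
    network still generalizes acceptably; back off otherwise."""
    if best_accuracy >= target:
        return min(lam + step, lam_max)
    return max(lam - step, 0.0)
```

For example, a pruned network using 50 of 100 possible weights with 95% validation accuracy and `lam = 0.1` scores `0.95 - 0.1 * 0.5 = 0.90`; since accuracy exceeds the target, the next generation would see a larger `lam`, further favoring smaller networks.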
Keywords :
feedforward neural nets; generalisation (artificial intelligence); genetic algorithms; adaptive parameter; fitness function; free parameters; generalization; ionosphere database; layered feedforward neural networks; neural network size; numbers database; weight pruning; Computer networks; Convergence; Electronic mail; Error correction; Feedforward neural networks; Neural networks; Technological innovation; Training data
Conference_Titel :
Southcon/95. Conference Record
Conference_Location :
Fort Lauderdale, FL
Print_ISBN :
0-7803-2576-1
DOI :
10.1109/SOUTHC.1995.516136