Title :
Optimal training parameters in multilayer feedforward networks
Author :
Wendemuth, Andreas ; Gerke, Michael
Author_Institution :
Univ. of Hagen, Germany
Abstract :
We present a systematic investigation of the training behavior of multilayer feedforward neural networks. Learning is usually governed by three metaparameters: learning rate, momentum, and offset. We apply a (nearly) exhaustive search method to find optimal parameter sets throughout the complete sequence of training cycles, regarding the training process as a finite state network in the space of metaparameter configurations. Minimization of training time is achieved by methods of dynamic programming. A detailed analysis is given for the choice of error criteria and the necessary widths and prunings of network 'beams' in search space. It is shown, for a representative set of training patterns, that the number of network training iterations is largely independent of both the metaparameter initialization and the random weight initialization. Training is twice as fast as with conventional metaparameter adaptation strategies, such as RPROP or local fuzzy inference.
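The search scheme described in the abstract can be illustrated with a minimal sketch: each training cycle branches over candidate (learning rate, momentum) pairs, and the resulting states are pruned to a fixed beam width by an error criterion. The toy objective, the candidate grids, and the beam width below are illustrative assumptions, not values from the paper.

```python
import itertools

# Toy objective f(w) = w^2, gradient 2w. One "training cycle" is a single
# momentum-SGD step under a chosen (learning_rate, momentum) pair.
def step(w, v, lr, mom):
    g = 2.0 * w
    v = mom * v - lr * g      # velocity update
    return w + v, v

LRS = [0.05, 0.1, 0.3]        # candidate learning rates (illustrative)
MOMS = [0.0, 0.5, 0.9]        # candidate momentum terms (illustrative)

def beam_search(w0=1.0, cycles=10, beam_width=4):
    # Each beam entry: (error, weight, velocity, metaparameter schedule).
    beam = [(w0 * w0, w0, 0.0, [])]
    for _ in range(cycles):
        candidates = []
        for err, w, v, path in beam:
            # Branch over all metaparameter configurations for this cycle.
            for lr, mom in itertools.product(LRS, MOMS):
                w2, v2 = step(w, v, lr, mom)
                candidates.append((w2 * w2, w2, v2, path + [(lr, mom)]))
        # Prune: keep only the beam_width lowest-error states.
        candidates.sort(key=lambda c: c[0])
        beam = candidates[:beam_width]
    return beam[0]  # best final error and its per-cycle metaparameter schedule
```

The returned schedule is the dynamic-programming analogue of the paper's optimal metaparameter sequence: a per-cycle choice of (learning rate, momentum) that minimizes the final error under the pruned search.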
Keywords :
dynamic programming; feedforward neural nets; learning (artificial intelligence); minimisation; multilayer perceptrons; RPROP; error criteria; exhaustive search method; finite state network; local fuzzy inference; metaparameter adaptation strategies; metaparameter configurations; multilayer feedforward networks; network training iterations; optimal parameter sets; optimal training parameters; training cycles; Backpropagation; Dynamic programming; Feedforward neural networks; Intelligent networks; Minimization methods; Multi-layer neural network; Neural networks; Nonhomogeneous media; Search methods; Transfer functions;
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.832667