DocumentCode :
285219
Title :
A comparison of weight elimination methods for reducing complexity in neural networks
Author :
Hergert, F. ; Finnoff, W. ; Zimmermann, H.G.
Author_Institution :
Siemens AG, Munich, Germany
Volume :
3
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
980
Abstract :
Three methods are examined for reducing complexity in potentially oversized networks. These consist of removing redundant elements based on some measure of saliency, adding a term to the cost function that penalizes complexity, or monitoring the error on a separate validation set of examples and stopping training as soon as that performance begins to deteriorate. It is demonstrated on a series of simulation examples that all of these methods can significantly improve generalization, but their performance can prove to be domain dependent.
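For context only (this sketch is not taken from the paper), the penalty-term approach mentioned in the abstract typically augments the training error with a weight-elimination penalty of the kind introduced by Weigend et al.; the hyperparameters lambda and w_0 below are illustrative assumptions, not quantities reported in this paper:

E(w) = \sum_{p}\bigl(t_{p} - y_{p}(w)\bigr)^{2} + \lambda \sum_{i} \frac{w_{i}^{2}/w_{0}^{2}}{1 + w_{i}^{2}/w_{0}^{2}}

Here the first sum is the usual squared error over training patterns, \lambda controls the strength of the complexity penalty, and w_{0} sets the scale below which individual weights are driven toward zero and can be removed from the network.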
Keywords :
computational complexity; generalisation (artificial intelligence); neural nets; cost function; domain dependent; generalization; neural networks; reducing complexity; redundant elements; weight elimination methods; Cost function; Intelligent networks; Neural networks; Noise measurement; Pressing; Research and development; Size measurement; Stochastic processes; Stochastic resonance; Testing;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.227072
Filename :
227072