DocumentCode
325066
Title
Regularization effect of weight initialization in back propagation networks
Author
Cherkassky, Vladimir; Shepherd, Robert
Author_Institution
Dept. of Electr. & Comput. Eng., Minnesota Univ., Minneapolis, MN, USA
Volume
3
fYear
1998
fDate
4-9 May 1998
Firstpage
2258
Abstract
Complexity control of a learning method is critical for obtaining good generalization from finite training data. We discuss complexity control in multilayer perceptron (MLP) networks trained via backpropagation. For such networks, the number of hidden units and/or network weights is usually used as the complexity parameter. However, backpropagation training itself introduces additional mechanisms for complexity control. These mechanisms are implicit in the implementation of the optimization procedure and, in contrast to the number of weights or hidden units, cannot be easily quantified. We suggest using the framework of statistical learning theory to explain the effect of weight initialization, and within this framework we demonstrate its effect on complexity control in MLP networks.
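The abstract's central claim, that the scale of random weight initialization acts as an implicit complexity parameter under backpropagation training, can be illustrated with a small experiment. The Python sketch below is an illustration only, not the authors' code: the toy sine-regression data, network size, learning rate, and epoch budget are all assumptions made for this example. It trains the same single-hidden-layer tanh MLP by plain batch backpropagation from initializations of different scales and reports the resulting test error.

import numpy as np

# Toy data (assumed setup): y = sin(2*pi*x) + noise, 50 training points.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal((50, 1))
x_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y_test = np.sin(2 * np.pi * x_test)            # noise-free target

def train_mlp(init_scale, hidden=20, lr=0.05, epochs=2000):
    """Single-hidden-layer tanh MLP trained by plain batch backpropagation."""
    g = np.random.default_rng(1)               # same Gaussian draws for every scale
    W1 = init_scale * g.standard_normal((1, hidden))
    b1 = np.zeros(hidden)
    W2 = init_scale * g.standard_normal((hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)               # forward pass
        out = h @ W2 + b2
        err = out - y                          # gradient of 0.5 * MSE
        gW2 = h.T @ err / len(x)
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h**2)       # backprop through tanh
        gW1 = x.T @ dh / len(x)
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    pred = np.tanh(x_test @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y_test) ** 2))

# Identical architecture, data, and training budget; only the initial
# weight scale changes. Small scales keep the tanh units near their
# linear region, acting like a smoothness (regularization) constraint.
for scale in (0.01, 0.1, 1.0, 5.0):
    print(f"init scale {scale:4.2f} -> test MSE {train_mlp(scale):.4f}")

Under a fixed training budget, the initialization scale bounds how far backpropagation can move the network from its initial near-linear state, which is one way to read the implicit complexity-control mechanism the abstract describes; it is a quantity set by the optimization setup rather than by the number of weights or hidden units.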
Keywords
backpropagation; computational complexity; generalisation (artificial intelligence); multilayer perceptrons; optimisation; statistical analysis; MLP networks; backpropagation networks; complexity control; finite training data; generalization; learning method; multilayer perceptron networks; regularization; statistical learning theory; weight initialization; Algorithm design and analysis; Backpropagation; Computer networks; Data engineering; Intelligent networks; Learning systems; Multilayer perceptrons; Predictive models; Statistical learning; Training data
fLanguage
English
Publisher
ieee
Conference_Titel
The 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location
Anchorage, AK
ISSN
1098-7576
Print_ISBN
0-7803-4859-1
Type
conf
DOI
10.1109/IJCNN.1998.687212
Filename
687212