Title :
Initializing weights to a hidden layer of a multilayer neural network by linear programming
Author_Institution :
CAD Center, Samsung Electron. Co. Ltd., Suwon, South Korea
Abstract :
This paper presents a new method for initializing the weights of a hidden layer in a backpropagation network. The proposed method initializes the weights of the hidden units by linear programming, so that backpropagation starts from linear hyperplanes placed around the input patterns rather than anywhere in the whole input space. Experimental results show that the method generates the hidden layer much more efficiently and trains the network faster than the ordinary backpropagation method.
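The abstract describes weight initialization by linear programming only at a high level, so the following is a minimal sketch of the general idea rather than the authors' formulation: for one hidden unit, an LP finds a hyperplane (weights w, bias b) that separates two groups of input patterns, and the solution is used as that unit's initial weights. The margin variable t, the box bounds on w, the use of scipy.optimize.linprog, and the function name lp_init_hidden_unit are all illustrative assumptions not taken from the paper.

```python
# Hedged sketch: initialize one hidden unit's weights with a separating
# hyperplane found by linear programming, instead of random weights spanning
# the whole input space. Not the paper's exact LP; formulation is assumed.
import numpy as np
from scipy.optimize import linprog


def lp_init_hidden_unit(X, y):
    """Return (w, b) placing a hyperplane between two groups of patterns.

    X : (n, d) array of input patterns
    y : (n,) array of +1 / -1 labels assigning each pattern to a side
    """
    n, d = X.shape
    # Decision variables: [w (d entries), b, t]; maximize the margin t.
    c = np.zeros(d + 2)
    c[-1] = -1.0                      # linprog minimizes, so minimize -t
    # Constraint y_i * (w . x_i + b) >= t  rewritten as
    # -y_i * (w . x_i) - y_i * b + t <= 0.
    A_ub = np.hstack([-y[:, None] * X, -y[:, None], np.ones((n, 1))])
    b_ub = np.zeros(n)
    # Box bounds keep the LP bounded; t >= 0 admits only separating solutions.
    bounds = [(-1.0, 1.0)] * d + [(-10.0, 10.0), (0.0, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, b = res.x[:d], res.x[d]
    return w, b


# Tiny usage example: separate two clusters of 2-D patterns and use the
# resulting hyperplane as the hidden unit's initial weights.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 0.5, (10, 2)), rng.normal(2, 0.5, (10, 2))])
    y = np.array([-1] * 10 + [1] * 10)
    w, b = lp_init_hidden_unit(X, y)
    print("initial hidden-unit weights:", w, "bias:", b)
```

Repeating such an LP for each hidden unit, with a different assignment of patterns to the two sides, would yield an initial hidden layer whose hyperplanes already pass near the training patterns before backpropagation begins.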
Keywords :
backpropagation; feedforward neural nets; linear programming; hidden layer; linear hyperplanes; multilayer neural network; weight initialisation; Acceleration; Lakes; Linear programming; Multi-layer neural network; Neural networks; Piecewise linear techniques; Strips;
Conference_Title :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
DOI :
10.1109/IJCNN.1993.716981