  • DocumentCode
    329058
  • Title
    Initializing weights to a hidden layer of a multilayer neural network by linear programming
  • Author
    Kim, Lark Sang

  • Author_Institution
    CAD Center, Samsung Electron. Co. Ltd., Suwon, South Korea
  • Volume
    2
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    1701
  • Abstract
    This paper presents a new method for initializing the weights of a hidden layer of a backpropagation net. The proposed method initializes the weights of the hidden units using linear programming, so that backpropagation starts with linear hyperplanes placed around the input patterns rather than spanning the whole input space. Experimental results show that the method generates the hidden layer much more efficiently and trains the net faster than the ordinary backpropagation method.
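    The abstract's core idea, using a linear program to place a separating hyperplane near the input patterns and taking its solution as a hidden unit's initial weights, can be sketched as below. This is a hedged illustration, not the paper's exact formulation: it uses a standard slack-minimizing separation LP (constraints y_i(w·x_i + b) >= 1 - s_i, minimize Σs_i) solved with `scipy.optimize.linprog`; the function name `lp_init_weights` and the specific LP are assumptions for the sketch.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def lp_init_weights(X, y):
        """Initial weights (w, b) for one hidden unit via a separation LP.

        Solves: minimize sum(s) subject to y_i*(w.x_i + b) >= 1 - s_i, s >= 0.
        X: (n, d) input patterns; y: (n,) labels in {-1, +1}.
        """
        n, d = X.shape
        # Decision variables z = [w (d), b (1), s (n)]; objective weights only on slack.
        c = np.concatenate([np.zeros(d + 1), np.ones(n)])
        # Margin constraints rewritten as A_ub @ z <= b_ub:
        #   -y_i*(w.x_i) - y_i*b - s_i <= -1
        A_ub = np.hstack([-(y[:, None] * X), -y[:, None], -np.eye(n)])
        b_ub = -np.ones(n)
        # w and b are free; slack variables are nonnegative.
        bounds = [(None, None)] * (d + 1) + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        w, b = res.x[:d], res.x[d]
        return w, b
    ```

    For linearly separable patterns the optimal slack is zero, so the returned hyperplane already classifies the patterns correctly and backpropagation can start from a useful decision boundary instead of random weights.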
  • Keywords
    backpropagation; feedforward neural nets; linear programming; hidden layer; linear hyperplanes; multilayer neural network; weight initialisation; Acceleration; Lakes; Multi-layer neural network; Neural networks; Piecewise linear techniques; Strips
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.716981
  • Filename
    716981