• DocumentCode
    2644528
  • Title
    Improving the training speed of three-layer feedforward neural nets by optimal estimation of the initial weights
  • Author
    Chen, Chih-Liang; Nutter, Roy S.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., West Virginia Univ., Morgantown, WV, USA
  • fYear
    1991
  • fDate
    18-21 Nov 1991
  • Firstpage
    2063
  • Abstract
    The authors formulate the training problem for three-layer feedforward neural nets in terms of linear algebra, following the well-known work of D. Rumelhart et al. (1986). They then develop two estimation algorithms, the forward estimation algorithm and the recurrent estimation algorithm, for estimating the initial weights. The basic idea is to place the initial weights as close as possible to a global minimum in weight space before training, thereby reducing training time. It is shown both theoretically and empirically that training is unnecessary when the number of hidden units is equal to or greater than the number of training patterns minus one. Simulations were conducted for several problems; the results show that both initial-weight estimation algorithms significantly improve training speed.
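    The abstract's key claim — that a net with at least (patterns − 1) hidden units can fit the training set by solving a linear system, with no iterative training — can be illustrated with a minimal sketch. This is not the authors' forward or recurrent estimation algorithm; it is an assumed least-squares construction: fix random input-to-hidden weights, then solve for hidden-to-output weights in the pre-sigmoid (linear) domain.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training set (XOR): 4 patterns, 2 inputs, 1 output.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])

    n_hidden = 3  # equals (number of patterns - 1), per the paper's condition

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Random input-to-hidden weights (bias handled as an extra ones column).
    W1 = rng.normal(size=(X.shape[1] + 1, n_hidden))
    H = sigmoid(np.hstack([X, np.ones((len(X), 1))]) @ W1)

    # Invert the output sigmoid on (clipped) targets, then least-squares
    # fit the hidden-to-output weights in the linear domain.
    eps = 1e-3
    Tc = np.clip(T, eps, 1 - eps)
    Z = np.log(Tc / (1 - Tc))                      # logit of targets
    Hb = np.hstack([H, np.ones((len(H), 1))])      # hidden outputs + bias
    W2, *_ = np.linalg.lstsq(Hb, Z, rcond=None)

    Y = sigmoid(Hb @ W2)
    print(np.max(np.abs(Y - T)))  # small residual: the net fits the data with no training
    ```

    With 3 hidden units plus a bias, `Hb` is a 4×4 matrix, so the system is generically exactly solvable — matching the abstract's observation that training becomes unnecessary at that hidden-layer size.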
  • Keywords
    learning systems; linear algebra; neural nets; optimisation; parallel algorithms; forward estimation algorithm; global minimum; recurrent estimation algorithm; three-layer feedforward neural nets; weights space; feedforward neural networks; feedforward systems; random processes
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1991 IEEE International Joint Conference on Neural Networks
  • Print_ISBN
    0-7803-0227-3
  • Type
    conf
  • DOI
    10.1109/IJCNN.1991.170691
  • Filename
    170691