  • DocumentCode
    2622983
  • Title
    Parameter sensitivity in the backpropagation learning algorithm
  • Author
    Manausa, Michael E.; Lacher, R.C.

  • Author_Institution
    Dept. of Comput. Sci., Florida State Univ., Tallahassee, FL, USA
  • fYear
    1991
  • fDate
    18-21 Nov 1991
  • Firstpage
    390
  • Abstract
    The sensitivity of the backpropagation training algorithm to its learning rate and gain parameters is investigated. The authors report results from numerical experiments giving evidence of extreme sensitivity of training time, and even training success, to these parameters. A dynamic parameter update method that avoids the chaotic regime is derived. It is concluded that small changes in the gain parameter or the learning rate can greatly alter training time for a backpropagation network, and even whether training is possible at all. Combined with evidence of similar sensitivity to initializations, these results show the impossibility of simple recipes for useful convergence criteria in backpropagation networks. A useful optimization strategy for the backpropagation algorithm may be easier to achieve by jointly optimizing all the variables that make up the stepsize.
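    The learning-rate sensitivity described in the abstract can be illustrated with a toy sketch (not taken from the paper): gradient descent on a simple quadratic loss, where a small change in the learning rate flips the iteration from convergent to divergent. The function and parameter values below are hypothetical, chosen only to make the convergent/divergent boundary visible.

    ```python
    # Toy illustration (not the authors' experiments): gradient descent on
    # f(w) = w^2, updated as w <- w - eta * f'(w) = w * (1 - 2*eta).
    # The step contracts when |1 - 2*eta| < 1 (i.e. eta < 1) and expands
    # otherwise, so a small change in eta near the boundary decides whether
    # training converges at all -- a simple analogue of the sensitivity the
    # abstract reports for backpropagation's learning rate and gain.

    def descend(eta, w0=1.0, steps=50):
        """Run fixed-stepsize gradient descent on f(w) = w^2."""
        w = w0
        for _ in range(steps):
            w -= eta * 2.0 * w  # gradient of w^2 is 2w
        return abs(w)

    print(descend(0.9))  # shrinks toward zero
    print(descend(1.1))  # grows without bound
    ```

    In a sigmoid network the gain multiplies the derivative of the activation, so it enters the effective stepsize alongside the learning rate; this is one reading of the abstract's suggestion to optimize all stepsize variables together.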
  • Keywords
    learning systems; neural nets; backpropagation learning algorithm; convergence; dynamic parameter update; gain parameters; learning rate; parameter sensitivity; Backpropagation algorithms; Chaos; Computer networks; Computer science; Convergence; Councils; Equations; Feedforward systems; Stationary state; Throughput
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1991 IEEE International Joint Conference on Neural Networks
  • Print_ISBN
    0-7803-0227-3
  • Type
    conf
  • DOI
    10.1109/IJCNN.1991.170433
  • Filename
    170433