• DocumentCode
    423629
  • Title
    A comparison of first- and second-order training algorithms for dynamic neural networks
  • Author
    Bajramovic, Ferid ; Gruber, Christian ; Sick, Bernhard
  • Author_Institution
    Inst. for Comput. Archit., Passau Univ., Germany
  • Volume
    2
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Firstpage
    837
  • Abstract
    Neural networks are often used for time series processing. Temporal information can, for example, be modeled using the dynamic neural network (DYNN) paradigm, which combines delay elements in the feedforward direction with recurrent connections. When networks are applied to process time series, learning typically becomes a very complex and time-consuming task. Therefore, training algorithms are needed that are both accurate and fast. In this article, six learning algorithms, namely temporal backpropagation through time, resilient propagation, Quasi-Newton, scaled conjugate gradient, backpropagation based on partial Quasi-Newton, and a combination of the latter two algorithms, are presented and applied to DYNN. The various learning algorithms are compared with respect to training duration, approximation and generalization capability, and convergence speed. Each algorithm is evaluated by means of three real-world application examples: the prediction of the number of users in a computer pool, the prediction of the energy consumption of a building, and the verification of a person by means of her/his signature. With respect to the experiments conducted here, RProp turns out to be the best algorithm for training DYNN.
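    The abstract identifies resilient propagation (RProp) as the best-performing training algorithm in these experiments. As a rough illustration of the technique (not the authors' implementation), the following is a minimal sketch of the RProp- variant, which adapts a per-weight step size from gradient sign changes while ignoring gradient magnitude; the quadratic toy objective and all parameter values are assumptions for demonstration only.

    ```python
    import numpy as np

    def rprop_minimize(grad, w, n_steps=100,
                       eta_plus=1.2, eta_minus=0.5,
                       delta_init=0.1, delta_min=1e-6, delta_max=50.0):
        """Minimal RProp- sketch: per-weight step sizes are grown when the
        gradient keeps its sign and shrunk when it flips; only the sign of
        the gradient determines the update direction."""
        w = np.asarray(w, dtype=float).copy()
        delta = np.full_like(w, delta_init)   # per-weight step sizes
        g_prev = np.zeros_like(w)
        for _ in range(n_steps):
            g = grad(w)
            same = g * g_prev > 0             # sign kept -> grow step
            flip = g * g_prev < 0             # sign flipped -> shrink step
            delta[same] = np.minimum(delta[same] * eta_plus, delta_max)
            delta[flip] = np.maximum(delta[flip] * eta_minus, delta_min)
            g = np.where(flip, 0.0, g)        # RProp-: skip update after a flip
            w -= np.sign(g) * delta
            g_prev = g
        return w

    # toy usage: minimize f(w) = sum((w - 3)^2), whose gradient is 2*(w - 3)
    w_opt = rprop_minimize(lambda w: 2.0 * (w - 3.0),
                           np.array([10.0, -4.0]), n_steps=200)
    ```

    Because the update uses only the gradient's sign, RProp is insensitive to the scale of the error surface, which is one commonly cited reason for its robustness on hard training problems.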
  • Keywords
    Newton method; approximation theory; backpropagation; conjugate gradient methods; convergence of numerical methods; feedforward neural nets; generalisation (artificial intelligence); recurrent neural nets; time series; approximation theory; computer pool; convergence speed; dynamic neural networks; energy consumption; feedforward direction; first order training algorithm; generalization; learning algorithms; partial Quasi-Newton algorithm; recurrent connection mechanism; resilient propagation; scaled conjugate gradient algorithm; second order training algorithm; temporal backpropagation through time; temporal information; time series processing; Application software; Approximation algorithms; Backpropagation algorithms; Computer architecture; Convergence; Electronic mail; Feedforward neural networks; Heuristic algorithms; Neural networks; Recurrent neural networks;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2004.1380038
  • Filename
    1380038