  • DocumentCode
    352974
  • Title
    Sufficient conditions for error back flow convergence in dynamical recurrent neural networks

  • Author
    Aussem, Alex

  • Author_Institution
    Univ. Blaise Pascal, Aubière, France
  • Volume
    4
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    577
  • Abstract
    This paper extends previous analyses of gradient decay to a class of discrete-time fully recurrent networks, called dynamical recurrent neural networks, obtained by modelling synapses as finite impulse response (FIR) filters instead of multiplicative scalars. Using elementary matrix manipulations, we provide an upper bound on the norm of the weight matrix which ensures that the gradient vector, when propagated backwards in time through the error-propagation network, decays exponentially to zero. This bound applies to all FIR architecture proposals as well as to fixed-point recurrent networks, regardless of delay and connectivity. In addition, we show that the computational overhead of the learning algorithm can be reduced drastically by taking advantage of the exponential decay of the gradient.
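    A minimal numerical sketch of the abstract's claim, not code from the paper: it assumes an ordinary tanh recurrent network (the special case of the FIR synapses above with filter order zero) and uses a spectral-norm bound in place of the paper's specific bound. It shows that bounding the weight-matrix norm makes the backpropagated error vanish exponentially, which in turn lets the backward pass be truncated early, the source of the computational savings mentioned at the end of the abstract.

    import numpy as np

    # Illustrative sketch only: a plain recurrent map x(t+1) = tanh(W x(t))
    # stands in for the DRNN. Since |tanh'| <= 1, a spectral norm
    # ||W||_2 < 1 is a sufficient condition of the kind the abstract
    # describes: the error vector, propagated backwards in time, decays
    # exponentially to zero.

    rng = np.random.default_rng(0)
    n, T = 10, 50
    W = rng.standard_normal((n, n))
    W *= 0.5 / np.linalg.norm(W, 2)        # rescale so ||W||_2 = 0.5 < 1

    x = rng.standard_normal(n)
    states = [x]
    for _ in range(T):                     # forward pass, recording states
        x = np.tanh(W @ x)
        states.append(x)

    delta = rng.standard_normal(n)         # error injected at final step T
    for tau in range(1, T + 1):            # back-propagate the error in time
        pre = W @ states[T - tau]          # pre-activation of state T - tau + 1
        delta = W.T @ ((1.0 - np.tanh(pre) ** 2) * delta)
        if np.linalg.norm(delta) < 1e-8:   # exponential decay permits truncation
            print(f"gradient negligible after {tau} of {T} backward steps")
            break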
  • Keywords
    FIR filters; backpropagation; convergence of numerical methods; gradient methods; matrix algebra; recurrent neural nets; dynamical recurrent neural networks; error back flow convergence; error-backpropagation; forgetting behaviour; gradient vector; learning algorithm; upper bound; Convergence; Delay effects; Differential equations; Electronic mail; Finite impulse response filter; Intelligent networks; Proposals; Recurrent neural networks; Sufficient conditions; Upper bound
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
  • Conference_Location
    Como
  • ISSN
    1098-7576
  • Print_ISBN
    0-7695-0619-4
  • Type
    conf
  • DOI
    10.1109/IJCNN.2000.860833
  • Filename
    860833