• DocumentCode
    423630
  • Title
    Backpropagation-decorrelation: online recurrent learning with O(N) complexity
  • Author
    Steil, Jochen J.
  • Author_Institution
    Neuroinformatics Group, Bielefeld University, Germany
  • Volume
    2
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Firstpage
    843
  • Abstract
    We introduce a new learning rule for fully recurrent neural networks, which we call the backpropagation-decorrelation (BPDC) rule. It combines two important principles: one-step backpropagation of errors and the use of the temporal memory in the network dynamics by means of decorrelation of the activations. The BPDC rule is derived and theoretically justified by regarding learning as a constrained optimization problem, and it applies uniformly in discrete and continuous time. It is very easy to implement and has a minimal complexity of 2N multiplications per time step in the single-output case. Nevertheless, we obtain fast tracking and excellent performance on several benchmark problems, including the Mackey-Glass time series. (An illustrative sketch of such an O(N) update follows this record.)
  • Keywords
    backpropagation; computational complexity; constraint theory; decorrelation; optimisation; real-time systems; recurrent neural nets; time series; 2N multiplications; Mackey-Glass time series; O(N) complexity; backpropagation-decorrelation rule; benchmark problems; constraint optimization problem; continuous time; discrete time; network dynamics; online recurrent learning; recurrent neural networks; temporal memory; tracking; Adaptive control; Backpropagation algorithms; Biological system modeling; Constraint optimization; Decorrelation; Information processing; Neurons; Recurrent neural networks; Reservoirs; Speech recognition;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2004.1380039
  • Filename
    1380039
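
A hedged illustration of the abstract's O(N) claim: the sketch below adapts only the N weights feeding a single output unit and scales the update by the reciprocal of the squared norm of the current network state, a decorrelation-style normalization. It is a minimal sketch under stated assumptions, not the exact BPDC formula from the paper; the tanh state update, the toy sine-tracking target, and all names (`W`, `w_out`, `step`, `eta`, `eps`) are illustrative choices.

```python
import numpy as np

# Minimal sketch (not the paper's exact BPDC rule): an O(N) online update of
# the N weights feeding a single output unit, scaled by the reciprocal of the
# squared state norm plus a small regularizer. All names and constants below
# are illustrative assumptions.

rng = np.random.default_rng(0)

N = 100                                  # number of recurrent units (assumption)
eta = 0.1                                # learning rate (assumption)
eps = 1e-3                               # regularizer against division by zero

W = 0.1 * rng.standard_normal((N, N))    # fixed recurrent weights (not adapted here)
w_out = np.zeros(N)                      # adapted output weights
x = rng.standard_normal(N)               # network state

def step(x, u):
    """One discrete-time state update of a tanh network driven by scalar input u."""
    return np.tanh(W @ x + u)

# Toy online task: track a sine wave one step ahead.
for k in range(1000):
    u = np.sin(0.1 * k)
    x = step(x, u)                       # the N^2 cost lies in the fixed dynamics,
    y = w_out @ x                        # not in the learning rule itself
    err = y - np.sin(0.1 * (k + 1))      # scalar tracking error
    # O(N) learning step: scalar error times state, divided by ||x||^2 + eps.
    w_out -= eta * err * x / (x @ x + eps)
```

The update itself costs about 2N multiplications per step (N for the squared state norm, N for scaling the state vector by the scalar error term), which matches the scale quoted in the abstract for the single-output case.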