• DocumentCode
    2109758
  • Title
    Learning laws with exponential error convergence for recurrent neural networks
  • Author
    Kosmatopoulos, Elias B. ; Christodoulou, M.A. ; Ioannou, Petros A.
  • Author_Institution
    Dept. of Electron. & Comput. Eng., Tech. Univ. of Crete, Chania, Greece
  • fYear
    1993
  • fDate
    15-17 Dec 1993
  • Firstpage
    2810
  • Abstract
    In this paper, we propose new learning laws for adjusting the weights of recurrent high-order neural networks (RHONN) when they are applied to system identification problems. The main advantages of these learning laws over classical robust adaptive ones are that the identification error converges to zero exponentially fast, and that this convergence is independent of the number of high-order connections of the RHONN.
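    The abstract states only the claim, not the learning laws themselves. As a rough illustration of the RHONN identification setup it refers to, the sketch below uses a generic gradient learning law on a first-order series-parallel RHONN identifier; the plant, the sigmoidal high-order regressor, the gains, and the update rule are all assumptions for illustration, not the paper's exponential-convergence laws:

    ```python
    import math

    def sigmoid(v):
        return 1.0 / (1.0 + math.exp(-v))

    # RHONN-style regressor: high-order terms built from products of
    # sigmoids of the state and input (structure assumed for this sketch).
    def regressor(x, u):
        s_x, s_u = sigmoid(x), sigmoid(u)
        return [s_x, s_u, s_x * s_u, s_x * s_x]

    a = 1.0                        # stable pole of plant and identifier
    gamma = 5.0                    # adaptation gain (assumed)
    dt = 1e-3                      # Euler integration step
    W_true = [1.0, -0.5, 0.8, 0.3] # hypothetical "true" plant weights

    W = [0.0] * 4                  # adjustable identifier weights
    x = x_hat = 0.0
    for k in range(20000):
        t = k * dt
        u = math.sin(2.0 * t) + 0.5 * math.sin(5.0 * t)  # exciting input
        z = regressor(x, u)
        e = x_hat - x                                    # identification error
        # Generic gradient learning law: W_dot = -gamma * e * z
        W = [w - dt * gamma * e * zi for w, zi in zip(W, z)]
        # Series-parallel identifier: x_hat_dot = -a*x_hat + W^T z
        x_hat += dt * (-a * x_hat + sum(w * zi for w, zi in zip(W, z)))
        # Plant assumed to match the RHONN structure with weights W_true
        x += dt * (-a * x + sum(w * zi for w, zi in zip(W_true, z)))

    print(abs(x_hat - x))  # identification error after adaptation
    ```

    With this generic gradient law the error shrinks as the weights adapt; the paper's contribution is a modified law under which the decay is provably exponential and its rate does not degrade as more high-order connections are added.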
  • Keywords
    convergence of numerical methods; error analysis; identification; learning (artificial intelligence); nonlinear systems; recurrent neural nets; exponential error convergence; learning laws; recurrent neural networks; system identification; Adaptive algorithm; Adaptive control; Convergence; Lyapunov method; Neural networks; Neurons; Parameter estimation; Programmable control; Recurrent neural networks; Robustness
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 32nd IEEE Conference on Decision and Control, 1993
  • Conference_Location
    San Antonio, TX
  • Print_ISBN
    0-7803-1298-8
  • Type
    conf
  • DOI
    10.1109/CDC.1993.325707
  • Filename
    325707