  • DocumentCode
    285272
  • Title
    Convergence of recurrent networks as contraction mappings
  • Author
    Steck, James Edward

  • Author_Institution
    Dept. of Mech. Eng., Wichita State Univ., KS, USA
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    462
  • Abstract
    Three theorems are presented that establish an upper bound on the magnitude of the weights which guarantees convergence of the network to a stable, unique fixed point. It is shown that the bound on the weights is inversely proportional to the product of the number of neurons in the network and the maximum slope of the neuron activation functions. The location of the fixed point is determined by the network architecture, the weights, and the external input values. The proofs are constructive, consisting of representing the network as a contraction mapping and then applying the contraction mapping theorem from point-set topology. The resulting sufficient conditions for network stability are shown to be general enough to allow the network to have nontrivial fixed points.
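
    The weight bound described in the abstract can be illustrated numerically. The sketch below is a hypothetical example, not the paper's own construction: it assumes a synchronous recurrent update x → tanh(Wx + b), uses tanh (whose maximum slope is 1), and draws every weight strictly inside the bound 1/(n·s) so that the update is a contraction in the max norm; iterating from two different initial states then converges to the same fixed point. All variable names and sizes are illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5                    # number of neurons (illustrative choice)
    s = 1.0                  # maximum slope of tanh
    bound = 1.0 / (n * s)    # per-weight magnitude bound from the abstract

    # Weights strictly inside the bound, plus an arbitrary external input.
    W = rng.uniform(-0.9 * bound, 0.9 * bound, size=(n, n))
    b = rng.uniform(-1.0, 1.0, size=n)

    def step(x):
        """One synchronous update of the recurrent network."""
        return np.tanh(W @ x + b)

    # Each step shrinks distances by at most s * n * max|w_ij| = 0.9 < 1,
    # so trajectories from any two initial states converge to one point.
    x, y = np.zeros(n), rng.uniform(-1.0, 1.0, size=n)
    for _ in range(200):
        x, y = step(x), step(y)

    assert np.allclose(x, y)        # both trajectories reach the same point
    assert np.allclose(x, step(x))  # and that point is a fixed point
    ```

    With each |w_ij| ≤ 0.9/(n·s), the max-norm Lipschitz constant of the update is at most s·n·(0.9/(n·s)) = 0.9, which is exactly the contraction-mapping condition the abstract's theorems formalize.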
  • Keywords
    convergence; recurrent neural nets; topology; contraction mappings; network stability; neuron activation functions; nontrivial fixed points; point set topology; recurrent networks; sufficient conditions; upper bound; Artificial neural networks; Backpropagation; Convergence; Mechanical engineering; Network topology; Neurons; Optimization methods; Pattern recognition; Stability; Sufficient conditions;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227131
  • Filename
    227131