• DocumentCode
    892669
  • Title
    A generalized convergence theorem for neural networks
  • Author
    Bruck, Jehoshua; Goodman, Joseph W.
  • Author_Institution
    Dept. of Electr. Eng., Stanford Univ., CA, USA
  • Volume
    34
  • Issue
    5
  • fYear
    1988
  • fDate
    9/1/1988
  • Firstpage
    1089
  • Lastpage
    1092
  • Abstract
    A neural network model is presented in which each neuron performs a threshold logic function. The model always converges to a stable state when operating in a serial mode and to a cycle of length at most 2 when operating in a fully parallel mode. This property is the basis for the potential applications of the model, such as associative memory devices and combinatorial optimization. The two convergence theorems (for serial and fully parallel modes of operation) are reviewed, and a general convergence theorem is presented that unifies the two known cases. New relations between the neural network model and the problem of finding a minimum cut in a graph are obtained.
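    The serial-mode convergence described in the abstract can be illustrated with a minimal sketch. This is not the paper's notation or proof, only an assumed Hopfield-style setup: neurons with states in {-1, +1}, a symmetric weight matrix `W`, thresholds `t`, and one-at-a-time (serial) updates, under which the network settles into a stable state.

    ```python
    import numpy as np

    # Hypothetical sketch (names W, t, s are illustrative, not from the paper):
    # a network of threshold-logic neurons with symmetric weights W and
    # thresholds t. In serial mode, one neuron updates at a time; the energy
    # E(s) = -0.5 * s @ W @ s + t @ s never increases, so updates must stop
    # at a stable state.

    def serial_converge(W, t, s, max_sweeps=100):
        """Update neurons one at a time until no neuron changes state."""
        n = len(s)
        for _ in range(max_sweeps):
            changed = False
            for i in range(n):
                # Threshold logic: fire (+1) iff the weighted input meets t[i].
                new_si = 1 if W[i] @ s - t[i] >= 0 else -1
                if new_si != s[i]:
                    s[i] = new_si
                    changed = True
            if not changed:
                return s  # stable: every neuron agrees with its own update rule
        return s

    # Tiny example: two mutually excitatory neurons align to a stable state.
    W = np.array([[0.0, 1.0], [1.0, 0.0]])
    t = np.zeros(2)
    stable = serial_converge(W, t, np.array([1, -1]))
    ```

    In fully parallel mode (all neurons updating simultaneously), the same network may instead enter a cycle of length 2, which is the second case the abstract's unified theorem covers.
    
    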
  • Keywords
    combinatorial switching; convergence; neural nets; optimisation; associative memory devices; combinatorial optimisation; convergence theorems; fully parallel mode; neural networks; neuron; serial mode; stable state; threshold logic function; Associative memory; Computer networks; Convergence; Logic functions; Military computing; Multidimensional systems; Neural networks; Neurons; Performance evaluation; Symmetric matrices
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.21239
  • Filename
    21239