  • DocumentCode
    2698776
  • Title
    Fixed-weight networks can learn
  • Author
    Cotter, Neil E.; Conwell, Peter R.
  • fYear
    1990
  • fDate
    17-21 June 1990
  • Firstpage
    553
  • Abstract
    A theorem describing how fixed-weight recurrent neural networks can approximate adaptive-weight learning algorithms is proved. The theorem applies to most networks and learning algorithms currently in use. It is concluded from the theorem that a system which exhibits learning behavior need exhibit no synaptic weight modifications. This idea is demonstrated by transforming a backward error propagation network into a fixed-weight system.
  • Keywords
    learning systems; neural nets; adaptive-weight learning algorithms; backward error propagation network; error backpropagation network; fixed-weight recurrent neural networks; synaptic weight modifications
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    1990 IJCNN International Joint Conference on Neural Networks
  • Conference_Location
    San Diego, CA, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1990.137898
  • Filename
    5726856