• DocumentCode
    2701941
  • Title
    Can supervised learning be achieved without explicit error back-propagation?
  • Author
    Brandt, Robert D.; Lin, Feng
  • Author_Institution
    Beckman Inst. for Adv. Sci. & Technol., Illinois Univ., Urbana, IL, USA
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    300
  • Abstract
    We propose a new model for implementing supervised learning algorithms in networks of sigmoidal "neurons" that does not require error feedback to be explicitly provided by a dedicated feedback network. In this model, adaptation is driven by a locally-defined environmental gradient that is implicit in the synaptic strengths, their rates of change, and the pre- and post-synaptic activity levels. This environmental gradient always exists and is generally non-zero, independently of the presence of a supervisor, so long as synaptic strengths are changing for any reason, much as a Hebbian gradient always exists at any synapse.
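    The abstract does not spell out the update rule itself. As a rough, purely illustrative sketch of the kind of locality it appeals to (all names and constants below are assumptions, not the authors' algorithm), a Hebbian-style update for one layer of sigmoidal neurons, in which each weight change depends only on quantities available at that synapse, might look like this in Python:

        import numpy as np

        # Illustrative sketch only: a purely local, Hebbian-style update for one
        # layer of sigmoidal neurons. No error signal is propagated back through
        # a dedicated feedback network; each synapse sees only its own strength,
        # its rate of change, and the pre-/post-synaptic activity levels.

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        rng = np.random.default_rng(0)
        W = rng.normal(scale=0.1, size=(3, 5))  # synaptic strengths: 3 outputs, 5 inputs
        eta = 0.05                              # learning rate (assumed value)

        x = rng.random(5)                       # pre-synaptic activity levels
        y = sigmoid(W @ x)                      # post-synaptic activity levels

        dW = eta * np.outer(y, x)               # classic Hebbian term, locally computable
        W += dW                                 # the rate of change dW is itself a local quantity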
  • Keywords
    learning (artificial intelligence); neural nets; Hebbian gradient; error feedback; locally-defined environmental gradient; post-synaptic activity levels; pre-synaptic activity levels; sigmoidal neurons; supervised learning; synaptic strength change; Artificial neural networks; Biological system modeling; Computer errors; Feedforward systems; Neural network hardware; Neurofeedback; Neurons; Neuroscience; Numerical models; Supervised learning
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548908
  • Filename
    548908