• DocumentCode
    288594
  • Title
    Backpropagation without weight transport
  • Author
    Kolen, John F.; Pollack, Jordan B.
  • Author_Institution
    Lab. for Artificial Intelligence Res., Ohio State Univ., Columbus, OH, USA
  • Volume
    3
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    1375
  • Abstract
    In backpropagation, connection weights are used both to compute node activations and to compute the error gradients for hidden units. Grossberg (1987) has argued that this dual use of the same synaptic connections (weight transport) constitutes a bidirectional flow of information through synapses, which is biologically implausible. In this paper we formally and empirically demonstrate the feasibility of an architecture equivalent to backpropagation, but without the assumption of weight transport. Through coordinated training with weight decay, a reciprocal layer of weights evolves into a copy of the forward connections and acts as the conduit for backward-flowing corrective information. Examination of the networks trained with dual weights suggests that functional synchronization, not weight synchronization, is crucial to the operation of backpropagation methods.
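    The following NumPy fragment is a minimal illustrative sketch of the idea described in the abstract, not the authors' code; the network size, learning rate, and decay coefficient are assumptions. Errors are propagated backward through a separate matrix B rather than through W2.T, and because W2 and B receive the same coordinated outer-product update plus weight decay, the difference W2.T - B shrinks over training and B becomes a functional copy of the forward connections.

      import numpy as np

      rng = np.random.default_rng(0)
      n_in, n_hid, n_out = 4, 8, 2
      lr, decay = 0.05, 0.01           # learning rate and weight-decay coefficient (assumed values)

      W1 = rng.normal(0, 0.5, (n_hid, n_in))
      W2 = rng.normal(0, 0.5, (n_out, n_hid))
      B  = rng.normal(0, 0.5, (n_hid, n_out))   # backward weights, initially unrelated to W2.T

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      X = rng.normal(size=(100, n_in))          # toy regression data (illustrative only)
      Y = rng.normal(size=(100, n_out))

      for epoch in range(500):
          for x, y in zip(X, Y):
              h = sigmoid(W1 @ x)               # forward pass
              o = W2 @ h
              e = o - y                         # output error
              # backward pass uses B, not W2.T, so no weight transport is assumed
              dh = (B @ e) * h * (1 - h)
              # coordinated updates: W2 and B share the same outer-product term, plus decay
              W2 -= lr * (np.outer(e, h) + decay * W2)
              B  -= lr * (np.outer(h, e) + decay * B)
              W1 -= lr * (np.outer(dh, x) + decay * W1)

      # the gap between forward and reciprocal weights should be small after training
      print("||W2.T - B|| =", np.linalg.norm(W2.T - B))

    In this sketch the update of the difference W2.T - B is simply a geometric shrinkage by (1 - lr * decay) per step, which is one way to see why the reciprocal layer evolves into a copy of the forward connections.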
  • Keywords
    backpropagation; neural net architecture; neural nets; synchronisation; backward flowing corrective information; connection weights; coordinated training; error gradient; functional synchronization; hidden units; node activations; synaptic connections; weight decay
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374486
  • Filename
    374486