• DocumentCode
    3565867
  • Title
    Training self-configuring backpropagation networks
  • Author
    Bryant, Garnett W.
  • Author_Institution
    Harry Diamond Labs., Adelphi, MD, USA
  • Volume
    1
  • fYear
    1992
  • Firstpage
    365
  • Abstract
    A generalization of the generalized delta rule (GDR) for error backpropagation in feedforward networks is presented. The generalization applies to networks with any topology of connections between nodes and an activation function at each node that is any differentiable function of its inputs and of the parameters to be adjusted by error backpropagation. The generalized GDR is used to implement networks in which the strength (magnitude of the activation function) of each node is adjusted during training. These networks are trained by minimizing the training error plus cost functions for the node strengths. Self-configuring networks are implemented using cost functions that have minimum cost when a node is either off (zero magnitude) or on (unit magnitude). Simulations show that a self-configuring network with all nodes initially inactive turns nodes on one by one during training, and that such a network can also deactivate nodes during training. Methods for training self-configuring networks are discussed.
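    The node-strength cost described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quartic form s²(1−s)², the weighting constant LAMBDA, and the learning rate are all assumptions chosen only to show a cost with minima at strength 0 (node off) and strength 1 (node on), so that gradient descent on the combined objective drives each node toward a binary on/off state.

    ```python
    # Hypothetical sketch of a node-strength cost with minima at s = 0 and
    # s = 1, as described in the abstract. During training this term would be
    # added to the network's training error, and its gradient to the error
    # gradient, so backpropagation turns nodes fully on or off.

    LAMBDA = 0.1  # assumed weight of the strength cost relative to training error

    def strength_cost(s):
        """Cost for one node's strength s: zero at s = 0 and s = 1, positive between."""
        return LAMBDA * s**2 * (1.0 - s)**2

    def strength_cost_grad(s):
        """d(cost)/ds, added to the error gradient during backpropagation."""
        return LAMBDA * (2.0 * s * (1.0 - s)**2 - 2.0 * s**2 * (1.0 - s))

    # Gradient descent on the cost term alone pushes a partially active node
    # toward the nearer minimum (here, s = 1, i.e. fully on).
    s = 0.8
    for _ in range(2000):
        s -= 1.0 * strength_cost_grad(s)
    ```

    A node started near zero strength would instead be driven to s = 0, which matches the abstract's picture of nodes being switched on or off one by one during training.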
  • Keywords
    backpropagation; feedforward neural nets; knowledge based systems; activation function; cost functions; feedforward networks; generalized delta rule; self-configuring backpropagation networks training; training error; Backpropagation; Cost function; Feedforward neural networks; Feeds; Laboratories; Network topology; Neural networks; Physics; Training data
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.287184
  • Filename
    287184