• DocumentCode
    285218
  • Title
    An extended back-propagation learning algorithm by using heterogeneous processing units
  • Author
    Chen, Chih-Liang; Nutter, Roy S.

  • Author_Institution
    Dept. of Electr. & Comput. Eng., West Virginia Univ., Morgantown, WV, USA
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    988
  • Abstract
    Based on the idea of using heterogeneous processing units (PUs) in a network, a variation of the backpropagation (BP) learning algorithm is presented. Three parameters, adjustable in the same way as connection weights, are incorporated into each PU to increase its autonomous capability by enhancing its output function. The extended BP learning algorithm is thus developed by updating these three parameters as well as the connection weights. The extended BP is intended not only to improve learning speed but also to reduce the occurrence of local minima. The algorithm has been tested intensively on the XOR problem. With carefully chosen learning rates, the results show that the extended BP appears to have advantages over standard BP in terms of faster learning and fewer local minima.
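    The abstract does not specify which three per-PU parameters are used, so the sketch below assumes a common parameterization of an enhanced sigmoid output function: an amplitude A, a slope lam, and a bias th, giving out = A / (1 + exp(-(lam * net + th))). All three are updated by gradient descent alongside the weights, as a minimal illustration of the general idea on the XOR problem described in the abstract; it is not the authors' exact algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR training data
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    T = np.array([[0], [1], [1], [0]], float)

    # 2-2-1 network: connection weights plus three hypothetical
    # per-unit output-function parameters (amplitude A, slope lam, bias th)
    W1 = rng.normal(0, 1, (2, 2))
    W2 = rng.normal(0, 1, (2, 1))
    A1, lam1, th1 = np.ones(2), np.ones(2), np.zeros(2)
    A2, lam2, th2 = np.ones(1), np.ones(1), np.zeros(1)

    def act(net, A, lam, th):
        """Enhanced sigmoid: amplitude, slope and bias are all trainable."""
        s = 1.0 / (1.0 + np.exp(-(lam * net + th)))
        return A * s, s

    def mse():
        h, _ = act(X @ W1, A1, lam1, th1)
        y, _ = act(h @ W2, A2, lam2, th2)
        return float(((y - T) ** 2).mean())

    eta = 0.5
    loss_before = mse()
    for _ in range(3000):
        # forward pass
        net1 = X @ W1
        h, s1 = act(net1, A1, lam1, th1)
        net2 = h @ W2
        y, s2 = act(net2, A2, lam2, th2)

        # backward pass: mean-squared-error gradients
        e = (y - T) / len(X)                       # dL/dy (up to a constant)
        dnet2 = e * A2 * s2 * (1 - s2) * lam2      # dL/dnet2
        gA2 = (e * s2).sum(0)                      # gradient w.r.t. amplitude
        glam2 = (e * A2 * s2 * (1 - s2) * net2).sum(0)
        gth2 = (e * A2 * s2 * (1 - s2)).sum(0)

        eh = dnet2 @ W2.T                          # error at hidden layer
        dnet1 = eh * A1 * s1 * (1 - s1) * lam1
        gA1 = (eh * s1).sum(0)
        glam1 = (eh * A1 * s1 * (1 - s1) * net1).sum(0)
        gth1 = (eh * A1 * s1 * (1 - s1)).sum(0)

        # update weights and the three PU parameters together
        W2 -= eta * h.T @ dnet2
        W1 -= eta * X.T @ dnet1
        A2 -= eta * gA2; lam2 -= eta * glam2; th2 -= eta * gth2
        A1 -= eta * gA1; lam1 -= eta * glam1; th1 -= eta * gth1

    loss_after = mse()
    print(loss_before, loss_after)
    ```

    Because the output function's shape itself adapts, each PU gains the extra autonomy the abstract describes; the standard BP update is the special case where A, lam and th stay fixed at 1, 1 and 0.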
  • Keywords
    backpropagation; neural nets; XOR problem; connection weights; extended backpropagation learning algorithm; heterogeneous processing units; local minima; neural networks; Computer networks; Equations; Standards development; Testing; Upper bound;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1992. IJCNN., International Joint Conference on
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227071
  • Filename
    227071