• DocumentCode
    311201
  • Title
    A neural network training algorithm utilizing multiple sets of linear equations
  • Author
    Chen, Hung-Han; Manry, Michael T.

  • Author_Institution
    Dept. of Electr. Eng., Texas Univ., Arlington, TX, USA
  • fYear
    1996
  • fDate
    3-6 Nov. 1996
  • Firstpage
    1166
  • Abstract
    A fast algorithm is presented for the training of multilayer perceptron neural networks. In each iteration, there are two passes through the training data. In the first pass, linear equations are solved for the output weights. In the second data pass, linear equations are solved for hidden unit weight changes. Full batching is used in both data passes. An algorithm is described for calculating the learning factor for use with the hidden weights. It is shown that the technique is significantly faster than standard output weight optimization-backpropagation.
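    The two-pass structure described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: it assumes a single-hidden-layer sigmoid MLP, uses a full-batch least-squares solve for the output weights (pass 1), and substitutes a plain gradient direction with a simple line-searched learning factor for the hidden-weight pass (pass 2), where the paper instead solves its own set of linear equations and derives the learning factor analytically. All names and sizes below are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy problem: N patterns, n_in inputs, n_hid hidden units, n_out outputs.
    N, n_in, n_hid, n_out = 200, 4, 8, 2
    X = rng.normal(size=(N, n_in))
    T = np.hstack([np.sin(X[:, :1]), X[:, 1:2] ** 2])  # illustrative targets

    Xb = np.hstack([X, np.ones((N, 1))])            # inputs with bias column
    W_h = 0.1 * rng.normal(size=(n_in + 1, n_hid))  # hidden-layer weights
    W_o = np.zeros((n_hid + 1, n_out))              # output weights

    def hidden(Xb, W_h):
        """Sigmoid hidden-unit activations for the whole batch."""
        return 1.0 / (1.0 + np.exp(-Xb @ W_h))

    for it in range(20):
        # Pass 1: full batch, solve linear equations for the output weights.
        H = hidden(Xb, W_h)
        Hb = np.hstack([H, np.ones((N, 1))])
        W_o = np.linalg.lstsq(Hb, T, rcond=None)[0]

        # Pass 2: full batch, compute a hidden-weight change direction.
        # (Here: the negative MSE gradient, backpropagated through the
        # sigmoids; the paper solves linear equations for these changes.)
        E = Hb @ W_o - T                             # output errors
        delta_h = (E @ W_o[:-1].T) * H * (1.0 - H)   # hidden-unit deltas
        G = Xb.T @ delta_h / N                       # gradient w.r.t. W_h

        # Learning factor via a crude 1-D search on the batch MSE
        # (an assumption; the paper calculates its factor directly).
        def mse(z):
            Hz = hidden(Xb, W_h - z * G)
            Hzb = np.hstack([Hz, np.ones((N, 1))])
            return np.mean((Hzb @ W_o - T) ** 2)

        z = min([0.0] + [0.1 * 2 ** k for k in range(8)], key=mse)
        W_h -= z * G

    Hb = np.hstack([hidden(Xb, W_h), np.ones((N, 1))])
    print("final training MSE:", np.mean((Hb @ W_o - T) ** 2))
    ```

    Because the output weights enter the network linearly once the hidden activations are fixed, pass 1 is an ordinary linear least-squares problem, which is what makes full batching natural in this scheme.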
  • Keywords
    learning (artificial intelligence); multilayer perceptrons; batching; data passes; fast algorithm; hidden unit weight; learning factor; linear equations; multilayer perceptron neural networks; neural network training algorithm; output weight optimization-backpropagation; output weights; surface scattering parameters; training data; Additives; Backpropagation algorithms; Equations; Feeds; Joining processes; Mean square error methods; Multi-layer neural network; Multilayer perceptrons; Neural networks; Training data;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Conference Record of the Thirtieth Asilomar Conference on Signals, Systems and Computers, 1996
  • Conference_Location
    Pacific Grove, CA, USA
  • ISSN
    1058-6393
  • Print_ISBN
    0-8186-7646-9
  • Type
    conf
  • DOI
    10.1109/ACSSC.1996.599128
  • Filename
    599128