• DocumentCode
    1400449
  • Title
    Nonlinear backpropagation: doing backpropagation without derivatives of the activation function

  • Author
    Hertz, John; Krogh, Anders; Lautrup, Benny; Lehmann, Torsten

  • Author_Institution
    Nordita, Copenhagen, Denmark
  • Volume
    8
  • Issue
    6
  • fYear
    1997
  • fDate
    11/1/1997
  • Firstpage
    1321
  • Lastpage
    1327
  • Abstract
    The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the necessity of calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of feedforward networks on the NetTalk problem. A discussion of implementation in analog very large scale integration (VLSI) electronics concludes the paper.
  • Keywords
    VLSI; analogue integrated circuits; backpropagation; feedforward neural nets; neural chips; nonlinear network synthesis; recurrent neural nets; transfer functions; NetTalk problem; activation function; analog VLSI; feedforward networks; neural processors; nonlinear backpropagation; nonlinear gradient descent; recurrent backpropagation; Backpropagation algorithms; Biomedical optical imaging; Equations; Hardware; Neural networks; Neurons; Numerical simulation; Read only memory; Recurrent neural networks; Very large scale integration
  • fLanguage
    English
  • Journal_Title
    Neural Networks, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour

  • DOI
    10.1109/72.641455
  • Filename
    641455
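The abstract's central idea, replacing the activation-derivative factor in backpropagation with a quantity computed from the activation function alone, can be illustrated with a minimal sketch. This is a hypothetical toy example of the general finite-difference principle, not the authors' exact recurrent-backpropagation derivation: standard backpropagation scales the back-propagated error e by g'(h), whereas the nonlinear form computes (g(h + beta*e) - g(h)) / beta, which calls only g itself and agrees with the standard delta as beta becomes small. The names (g, delta_standard, delta_nonlinear, beta) are illustrative assumptions.

```python
import numpy as np

def g(h):
    """Example activation; only g, never g', is evaluated in the nonlinear rule."""
    return np.tanh(h)

def delta_standard(h, e):
    # Conventional (linear) backpropagation: requires the explicit
    # derivative g'(h) = 1 - tanh(h)**2 of the activation function.
    return (1.0 - np.tanh(h) ** 2) * e

def delta_nonlinear(h, e, beta=1e-3):
    # Nonlinear variant: a difference of two forward activation values
    # replaces the derivative, which is attractive in analog hardware
    # where g is available as a circuit but g' is not.
    return (g(h + beta * e) - g(h)) / beta

# Toy pre-activations and back-propagated errors for one layer.
h = np.array([0.3, -1.2, 0.8])
e = np.array([0.5, -0.1, 0.2])
print(delta_standard(h, e))
print(delta_nonlinear(h, e))  # converges to the standard deltas as beta -> 0
```

For small beta the two rules are numerically indistinguishable; larger beta gives a genuinely nonlinear error propagation, which is the regime the paper studies.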