  • DocumentCode
    2821783
  • Title
    Opposite Transfer Functions and Backpropagation Through Time
  • Author
    Ventresca, Mario; Tizhoosh, Hamid R.

  • Author_Institution
    Syst. Design Eng. Dept., Waterloo Univ., Ont.
  • fYear
    2007
  • fDate
    1-5 April 2007
  • Firstpage
    570
  • Lastpage
    577
  • Abstract
    Backpropagation through time is a very popular discrete-time recurrent neural network training algorithm. However, the learning process requires substantial computation time to achieve high accuracy. While many approaches have been proposed that alter the learning algorithm itself, this paper presents a computationally inexpensive method based on the concept of opposite transfer functions to improve learning in the backpropagation through time algorithm. Specifically, we show improvements in accuracy and stability, as well as an acceleration in learning time. We use three common benchmarks to provide experimental evidence of these improvements.
  • Keywords
    backpropagation; recurrent neural nets; transfer functions; backpropagation through time; discrete-time recurrent neural network training; opposite transfer functions; Acceleration; Backpropagation algorithms; Computational complexity; Computational intelligence; Convergence; Nonlinear dynamical systems; Recurrent neural networks; Signal processing algorithms; Stability; Transfer functions; Backpropagation through time; opposite transfer functions; opposition-based learning
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Foundations of Computational Intelligence, 2007. FOCI 2007. IEEE Symposium on
  • Conference_Location
    Honolulu, HI
  • Print_ISBN
    1-4244-0703-6
  • Type
    conf
  • DOI
    10.1109/FOCI.2007.371529
  • Filename
    4233963
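
The opposite-transfer-function idea summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's actual training scheme (which applies the concept within backpropagation through time); it only shows, under the common opposition-based-learning convention, how the opposite of a sigmoid transfer function is formed by evaluating it at the opposite input. The function names here are illustrative, not taken from the paper.

```python
import math

def sigmoid(x):
    """Standard logistic transfer function."""
    return 1.0 / (1.0 + math.exp(-x))

def opposite_sigmoid(x):
    """Illustrative opposite transfer function of the sigmoid.

    In opposition-based learning, the opposite of a transfer
    function f is obtained by evaluating f at the opposite input,
    i.e. f(-x); for the sigmoid this equals 1 - f(x).
    """
    return sigmoid(-x)

# The two functions are reflections of each other: f(x) + f(-x) = 1.
for x in (-2.0, 0.0, 3.5):
    assert abs(sigmoid(x) + opposite_sigmoid(x) - 1.0) < 1e-12
```

Because the opposite function reuses the same evaluation, switching a neuron between a transfer function and its opposite adds essentially no computational cost, which is consistent with the abstract's claim of a computationally inexpensive method.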