• DocumentCode
    285199
  • Title
    A convenient method to prune multilayer neural networks via transform domain backpropagation algorithm

  • Author
    Yang, Xiahua

  • Author_Institution
    Dept. of Electron. Eng., Jiao Tong Univ., Shanghai, China
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    817
  • Abstract
    It is proved that the transform domain backpropagation (BP) algorithm with a variable learning rate is an effective algorithm for accelerating the convergence of a multilayer neural network. It is shown that the transform domain BP algorithm can also be applied to prune neural networks conveniently and to accelerate the convergence to some extent. This is based on the fact that the correlation within the input pattern of every layer can be removed via an orthogonal transform.
  • Keywords
    backpropagation; neural nets; correlation; multilayer neural networks; pruning; transform domain backpropagation; variable learning rate; Acceleration; Algorithm design and analysis; Backpropagation algorithms; Convergence; Discrete transforms; Frequency; Multi-layer neural network; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1992. IJCNN., International Joint Conference on
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227051
  • Filename
    227051
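
The abstract describes the method only at a high level. Below is a minimal, hypothetical sketch of the idea as the abstract reads: each layer's input is decorrelated with an orthonormal transform (a DCT is assumed here), the weights are trained by ordinary backpropagation in that transform domain with a per-coefficient (variable) learning rate, and pruning then amounts to zeroing small transform-domain weights. The network size, toy data, learning-rate normalisation, and pruning threshold are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dct_matrix(n):
    """Orthonormal DCT-II matrix; one possible choice of orthogonal
    transform for decorrelating a layer's input (an assumption of this
    sketch, not necessarily the transform used in the paper)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    T = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    T[0, :] *= 1.0 / np.sqrt(2)
    return T * np.sqrt(2.0 / n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: learn XOR with a 2-4-1 network trained in the transform domain.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
T_in = dct_matrix(n_in)    # transform applied to the network input
T_hid = dct_matrix(n_hid)  # transform applied to the hidden activations

# Weights act on transform-domain inputs (hypothetical initialisation scale).
W1 = rng.normal(scale=0.5, size=(n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_hid, n_out)); b2 = np.zeros(n_out)

base_lr = 0.5
for epoch in range(5000):
    # Forward pass: transform each layer's input, then apply the weights.
    Z_in = X @ T_in.T                 # decorrelated (transform-domain) input
    H = sigmoid(Z_in @ W1 + b1)
    Z_hid = H @ T_hid.T               # decorrelated hidden representation
    out = sigmoid(Z_hid @ W2 + b2)

    # Backward pass: standard BP, but gradients live in the transform domain.
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) @ T_hid * H * (1 - H)

    # Variable learning rate per transform coefficient: coefficients with
    # larger input power take a smaller step (an assumed normalisation,
    # in the spirit of the variable-rate scheme mentioned in the abstract).
    lr1 = base_lr / (1e-3 + np.mean(Z_in ** 2, axis=0))[:, None]
    lr2 = base_lr / (1e-3 + np.mean(Z_hid ** 2, axis=0))[:, None]

    W2 -= lr2 * (Z_hid.T @ d_out) / len(X)
    b2 -= base_lr * d_out.mean(axis=0)
    W1 -= lr1 * (Z_in.T @ d_hid) / len(X)
    b1 -= base_lr * d_hid.mean(axis=0)

# "Convenient" pruning: zero transform-domain weights with small magnitude
# (threshold is an arbitrary illustrative value).
threshold = 0.05
W1_pruned = np.where(np.abs(W1) < threshold, 0.0, W1)
W2_pruned = np.where(np.abs(W2) < threshold, 0.0, W2)
print("pruned W1 nonzeros:", np.count_nonzero(W1_pruned), "of", W1.size)
print("outputs:", sigmoid(sigmoid(X @ T_in.T @ W1_pruned + b1)
                          @ T_hid.T @ W2_pruned + b2).ravel())
```

Because the orthogonal transform concentrates the input energy into a few coefficients, many transform-domain weights tend toward small magnitudes, which is what makes magnitude-based pruning in that domain convenient in this reading of the abstract.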