• DocumentCode
    3135767
  • Title
    Dynamically pruning output weights in an expanding multilayer perceptron neural network

  • Author
    Amin, H.; Curtis, K.M.; Hayes-Gill, B.R.

  • Author_Institution
    Dept. of Electr. & Electron. Eng., Nottingham Univ., UK
  • Volume
    2
  • fYear
    1997
  • fDate
    2-4 Jul 1997
  • Firstpage
    991
  • Abstract
    The network size for a multilayer perceptron neural network is often chosen arbitrarily for different applications, and the optimum size of the network is determined by a long process of trial and error. This paper presents a backpropagation algorithm for a multilayer perceptron (MLP) neural network that dynamically determines the optimum number of hidden nodes and applies a new pruning technique to the output weights. A 29% reduction in the total number of output weights was observed for a handwritten character recognition problem using the new pruning algorithm.
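    The abstract does not state the paper's exact pruning criterion, so the following is only a minimal sketch of the general idea, assuming a simple magnitude threshold on the output-layer weights; the function name and threshold value are illustrative, not from the paper.

    ```python
    def prune_output_weights(weights, threshold=0.05):
        """Zero out output-layer weights whose magnitude falls below threshold.

        weights: list of rows, one row per hidden node, each row holding the
        weights from that hidden node to every output node.
        Returns the pruned weights and the fraction of weights removed.
        NOTE: magnitude thresholding is an assumption; the paper's actual
        criterion is not given in the abstract.
        """
        total = sum(len(row) for row in weights)
        pruned = [[w if abs(w) >= threshold else 0.0 for w in row]
                  for row in weights]
        zeros_before = sum(1 for row in weights for w in row if w == 0.0)
        zeros_after = sum(1 for row in pruned for w in row if w == 0.0)
        return pruned, (zeros_after - zeros_before) / total
    ```

    For example, pruning `[[0.5, 0.01], [-0.02, 0.3]]` at the default threshold removes the two near-zero weights, a 50% reduction for that toy layer.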
  • Keywords
    backpropagation; multilayer perceptrons; optimisation; MLP; dynamically pruning output weights; expanding multilayer perceptron neural network; handwritten character recognition problem; hidden nodes; optimum network size; output weight pruning; Backpropagation algorithms; Character recognition; Intelligent networks; Mean square error methods; Monitoring; Multi-layer neural network; Neural networks; Neurons; Parallel processing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 13th International Conference on Digital Signal Processing (DSP 97), 1997
  • Conference_Location
    Santorini
  • Print_ISBN
    0-7803-4137-6
  • Type
    conf

  • DOI
    10.1109/ICDSP.1997.628530
  • Filename
    628530