  • DocumentCode
    1354757
  • Title
    Neural network architectures and learning algorithms
  • Author
    Wilamowski, Bogdan M.
  • Author_Institution
    Technology Center at Auburn University
  • Volume
    3
  • Issue
    4
  • fYear
    2009
  • Firstpage
    56
  • Lastpage
    63
  • Abstract
    Neural networks are powerful nonlinear signal processors, yet the results obtained with them are often far from satisfactory. The purpose of this article is to examine the reasons for these frustrations and to show how to make neural network applications successful. The main challenges are: (1) Which neural network architecture should be used? (2) How large should a neural network be? (3) Which learning algorithms are most suitable? The multilayer perceptron (MLP) is unfortunately the preferred topology of most researchers. It is the oldest neural network architecture and is supported by virtually all training software. However, the MLP topology is less powerful than topologies such as the bridged multilayer perceptron (BMLP), in which connections across layers are allowed. The error backpropagation (EBP) algorithm is the most popular learning algorithm, but it is very slow and seldom gives adequate results: EBP training requires 100-1,000 times more iterations than more advanced algorithms such as the Levenberg-Marquardt (LM) or neuron-by-neuron (NBN) algorithms. More importantly, EBP is not only slow but often unable to find solutions for close-to-optimum neural networks. The article describes and compares several learning algorithms. A minimal sketch contrasting the EBP and LM weight updates follows this record.
  • Keywords
    learning (artificial intelligence); neural net architecture; EBP training process; Levenberg-Marquardt algorithm; bridged multilayer perceptron; close-to-optimum neural networks; error backpropagation algorithm; learning algorithms; multilayer perceptron architecture; neural network topology; neuron-by-neuron algorithm; Circuit topology; Convergence; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Neurons; Signal processing; Signal processing algorithms; Support vector machines;
  • fLanguage
    English
  • Journal_Title
    Industrial Electronics Magazine, IEEE
  • Publisher
    IEEE
  • ISSN
    1932-4529
  • Type
    jour
  • DOI
    10.1109/MIE.2009.934790
  • Filename
    5352485
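
The abstract above contrasts the first-order EBP update with the second-order Levenberg-Marquardt update, w <- w - (J^T J + mu I)^(-1) J^T e, where J is the Jacobian of the output errors with respect to the weights. The Python sketch below is only a minimal illustration of that contrast under stated assumptions, not the NBN algorithm or any implementation described in the article: the 2-3-1 network, the XOR data, the learning rate, the damping schedule, and the finite-difference Jacobian are all illustrative choices.

# Minimal EBP-versus-LM sketch on a 2-3-1 MLP trained on XOR (illustration only).
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: XOR, a classic test case for very small networks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    """Split the flat 13-element weight vector into the 2-3-1 layer matrices."""
    return w[:9].reshape(3, 3), w[9:].reshape(1, 4)   # (hidden, output)

def errors(w):
    """Residual vector e = network outputs minus targets."""
    W1, W2 = unpack(w)
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias input
    H = np.tanh(Xb @ W1.T)                      # hidden-layer activations
    Hb = np.hstack([H, np.ones((len(H), 1))])   # append bias to hidden outputs
    return (Hb @ W2.T).ravel() - y              # linear output neuron

def jacobian(w, eps=1e-6):
    """Finite-difference Jacobian de/dw (a real implementation computes it analytically)."""
    e0, J = errors(w), np.zeros((len(y), len(w)))
    for i in range(len(w)):
        wp = w.copy()
        wp[i] += eps
        J[:, i] = (errors(wp) - e0) / eps
    return J

w_ebp = w_lm = rng.normal(scale=0.5, size=13)
mu = 0.01                                       # LM damping parameter

for k in range(200):
    # EBP-style update: steepest descent on the sum-of-squares error.
    e, J = errors(w_ebp), jacobian(w_ebp)
    w_ebp = w_ebp - 0.1 * (J.T @ e)

    # LM update: Gauss-Newton step damped by mu, with the usual mu adaptation.
    e, J = errors(w_lm), jacobian(w_lm)
    step = np.linalg.solve(J.T @ J + mu * np.eye(w_lm.size), J.T @ e)
    if np.sum(errors(w_lm - step) ** 2) < np.sum(e ** 2):
        w_lm, mu = w_lm - step, mu * 0.5        # accept the step, trust the model more
    else:
        mu *= 10.0                              # reject the step, fall back toward gradient descent

print("EBP-style sum-of-squares error:", np.sum(errors(w_ebp) ** 2))
print("LM sum-of-squares error:       ", np.sum(errors(w_lm) ** 2))

Running both loops for the same number of iterations and comparing the printed sum-of-squares errors gives a feel for the iteration-count gap the abstract describes; the exact numbers depend on the random initialization and the chosen hyperparameters.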