  • DocumentCode
    1013044
  • Title
    On global-local artificial neural networks for function approximation
  • Author
    Wedge, D.; Ingram, David; McLean, D.; Bandar, Zuhair
  • Author_Institution
    Silent Talker, Manchester Metropolitan Univ., UK
  • Volume
    17
  • Issue
    4
  • fYear
    2006
  • fDate
    7/1/2006
  • Firstpage
    942
  • Lastpage
    952
  • Abstract
    We present a hybrid radial basis function (RBF) sigmoid neural network with a three-step training algorithm that utilizes both global search and gradient descent training. The algorithm is intended to identify global features of an input-output relationship before adding local detail to the approximating function. It aims to achieve efficient function approximation by separately identifying the aspects of a relationship that hold throughout the input space and those that vary only within particular regions of it. We test the effectiveness of our method on five regression tasks; four use synthetic datasets, and the fifth uses real-world data on the wave overtopping of seawalls. The hybrid architecture is shown to be often superior to architectures containing neurons of a single type in several respects: it frequently achieves lower mean square errors with fewer hidden neurons and less need for regularization. Our global-local artificial neural network (GL-ANN) also compares favorably with both perceptron radial basis nets and regression-tree-derived RBFs. A number of issues concerning the training of GL-ANNs are discussed: the use of regularization, the inclusion of a gradient descent optimization step, the choice of RBF spreads, model selection, and the development of appropriate stopping criteria.
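    The global-then-local scheme the abstract describes can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' GL-ANN algorithm: the plain gradient descent used for the global step, the residual-based RBF center placement, the fixed spread value, and the `fit_gl_ann_sketch` function name are all assumptions made for the example.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def rbf(X, centers, spread):
        # Gaussian RBF activations: exp(-||x - c||^2 / (2 * spread^2))
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * spread ** 2))

    def fit_gl_ann_sketch(X, y, n_sig=3, n_rbf=5, spread=0.5, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape

        # Step 1 (global): fit a small sigmoid layer to capture broad
        # trends.  (The paper uses global search here; plain full-batch
        # gradient descent is a simplification.)
        W = rng.normal(scale=0.5, size=(d, n_sig)); b = np.zeros(n_sig)
        v = rng.normal(scale=0.5, size=n_sig); c0 = 0.0
        lr = 0.1
        for _ in range(2000):
            H = sigmoid(X @ W + b)
            err = H @ v + c0 - y
            g = H * (1 - H)                    # sigmoid derivative
            v -= lr * H.T @ err / n
            c0 -= lr * err.mean()
            delta = np.outer(err, v) * g       # backpropagated error, n x n_sig
            W -= lr * X.T @ delta / n
            b -= lr * delta.mean(0)

        # Step 2 (local): place RBF centers where the global fit is worst,
        # then solve the combined output weights by linear least squares.
        H = sigmoid(X @ W + b)
        resid = y - (H @ v + c0)
        centers = X[np.argsort(-np.abs(resid))[:n_rbf]]
        Phi = rbf(X, centers, spread)
        A = np.hstack([H, Phi, np.ones((n, 1))])
        w_out, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Step 3 (joint gradient-descent refinement of all parameters)
        # is omitted here; the sketch stops after the least-squares solve.
        def predict(Xq):
            Aq = np.hstack([sigmoid(Xq @ W + b),
                            rbf(Xq, centers, spread),
                            np.ones((len(Xq), 1))])
            return Aq @ w_out
        return predict
    ```

    Because the least-squares solve in step 2 reuses the sigmoid activations alongside the new RBF columns, the hybrid fit can only improve on the sigmoid-only fit on the training data, which mirrors the idea of adding local detail on top of a global approximation.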
  • Keywords
    function approximation; gradient methods; learning (artificial intelligence); mean square error methods; optimisation; radial basis function networks; regression analysis; search problems; trees (mathematics); RBF spreads; appropriate stopping criteria; global search training; global-local artificial neural networks; gradient descent optimization step; gradient descent training; hidden neurons; hybrid radial basis function sigmoid neural network; input-output relationship; mean square errors; model selection; perceptron radial basis net; regression tasks; regression tree derived RBF; synthetic datasets; three-step training algorithm; wave seawall overtopping; Acoustic noise; Approximation algorithms; Artificial neural networks; Function approximation; Mathematical analysis; Mathematics; Mean square error methods; Neurons; Regression tree analysis; Testing; Global; hybrid; local; overtopping; regularization
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    ieee
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/TNN.2006.875972
  • Filename
    1650249