• DocumentCode
    274194
  • Title
    On the use of predefined regions to minimize the training and complexity of multilayer neural networks
  • Author
    Houselander, P.K.; Taylor, J.T.
  • Author_Institution
    Univ. Coll. London, UK
  • fYear
    1989
  • fDate
    16-18 Oct 1989
  • Firstpage
    383
  • Lastpage
    386
  • Abstract
    The flexibility of the multilayer perceptron (MLP) in conjunction with the back-propagation training algorithm has been instrumental in the resurgence of interest in the study of artificial neural networks. The MLP can map any distributed set of arbitrarily shaped regions, and therefore functions as a universal mapping network. However, it is demonstrated that this flexibility incurs a penalty in training time. The MLP is compared with two other networks that contain predefined regions, each region being either an ellipsoid or a sphere. The three networks have the same number of adjustable parameters to enable a fair comparison, and each is trained on two test patterns. In both tests the MLP performs poorly, calling into question its usefulness in a practical environment. In contrast, the other networks perform well, recognising the presence of disjoint regions early in the training sequence, with the network based on ellipsoids performing marginally better than the one based on spheres. It is concluded that the radial basis function network with ellipsoidal regions gives better performance than the MLP.
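    The ellipsoidal and spherical predefined regions mentioned in the abstract can be illustrated with a minimal radial-basis-style unit. The sketch below is purely illustrative and is not the authors' implementation; the function name, the Gaussian activation, and the parameterisation by semi-axis lengths are assumptions for demonstration.

    ```python
    import numpy as np

    def ellipsoidal_rbf(x, center, axes):
        """Illustrative activation of a radial basis unit with an
        ellipsoidal region (names and form are assumptions, not
        taken from the paper).

        x, center, axes: 1-D arrays of equal length; `axes` holds the
        semi-axis lengths of the ellipsoid.
        """
        # Scaling each coordinate by its semi-axis maps the ellipsoid
        # onto a unit sphere, giving a squared elliptical distance.
        d2 = np.sum(((x - center) / axes) ** 2)
        # Gaussian response: 1.0 at the center, decaying outward.
        return np.exp(-d2)
    ```

    A spherical region is the special case in which all semi-axes are equal, which is one way the sphere-based network of the abstract can be seen as a restriction of the ellipsoid-based one.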
  • Keywords
    learning systems; neural nets; artificial neural networks; back-propagation training algorithm; complexity minimisation; disjoint regions; ellipsoids; multilayer neural networks; multilayer perceptron; predefined regions; radial basis function; universal mapping network;
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Title
    First IEE International Conference on Artificial Neural Networks, 1989 (Conf. Publ. No. 313)
  • Conference_Location
    London
  • Type
    conf
  • Filename
    51998