  • DocumentCode
    3189602
  • Title
    Gradient descent learning of radial basis neural networks
  • Author
    Karayiannis, Nicolaos B.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Houston Univ., TX, USA
  • Volume
    3
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    1815
  • Abstract
    This paper presents an axiomatic approach for building RBF neural networks and proposes a supervised learning algorithm based on gradient descent for their training. This approach results in a broad variety of admissible RBF models, including those employing Gaussian radial basis functions. The form of the radial basis functions is determined by a generator function. A sensitivity analysis explains the failure of gradient descent learning on RBF networks with Gaussian radial basis functions, which are generated by an exponential generator function. The same analysis verifies that RBF networks generated by a linear generator function are much more suitable for gradient descent learning. Experiments involving such RBF networks indicate that the proposed gradient descent algorithm guarantees fast learning and very satisfactory generalization ability.
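    For intuition only, the kind of training the abstract describes can be sketched as a small RBF network with Gaussian radial basis functions whose prototypes, widths, and output weights are all updated by gradient descent on the squared error. This is not the paper's exact algorithm or generator-function formulation; the toy data, network size, and learning rate below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D regression target: y = sin(2*pi*x) on [0, 1] (illustrative data)
    X = rng.uniform(0.0, 1.0, size=(200, 1))
    y = np.sin(2.0 * np.pi * X[:, 0])

    n_centers = 10                                        # number of prototypes (assumed)
    centers = rng.uniform(0.0, 1.0, size=(n_centers, 1))  # prototype vectors c_j
    log_sigma = np.full(n_centers, np.log(0.1))           # widths, log-parameterized
    weights = rng.normal(0.0, 0.1, size=n_centers)        # output weights w_j
    bias = 0.0
    lr = 0.1                                              # learning rate (assumed)

    def forward(X, centers, log_sigma, weights, bias):
        """Gaussian RBF responses phi_j(x) = exp(-||x - c_j||^2 / (2 sigma_j^2))."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (N, J)
        sigma2 = np.exp(2.0 * log_sigma)
        phi = np.exp(-d2 / (2.0 * sigma2))
        return phi, d2, sigma2, phi @ weights + bias

    for _ in range(3000):
        phi, d2, sigma2, pred = forward(X, centers, log_sigma, weights, bias)
        err = pred - y                                    # dE/dpred for E = 0.5*mean(err^2)
        g_w = phi.T @ err / len(X)                        # gradient wrt output weights
        g_b = err.mean()                                  # gradient wrt bias
        g_phi = err[:, None] * weights[None, :]           # (N, J): dE/dphi_j
        diff = (X[:, None, :] - centers[None, :, :])[:, :, 0]
        # Chain rule through the Gaussian: dphi/dc_j = phi*(x - c_j)/sigma_j^2,
        # dphi/dlog(sigma_j) = phi * d2 / sigma_j^2
        g_c = (g_phi * phi * diff / sigma2[None, :]).mean(axis=0)[:, None]
        g_ls = (g_phi * phi * d2 / sigma2[None, :]).mean(axis=0)
        weights -= lr * g_w
        bias -= lr * g_b
        centers -= lr * g_c
        log_sigma -= lr * g_ls
    ```

    Because every parameter group (prototypes, widths, weights) receives a gradient signal, the update illustrates why the sensitivity of the basis-function responses to these parameters matters for whether gradient descent makes progress.
    
    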
  • Keywords
    feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); Gaussian radial basis functions; exponential generator function; generalization ability; gradient descent learning; linear generator function; radial basis neural networks; sensitivity analysis; supervised learning algorithm; Clustering algorithms; Computer networks; Multidimensional systems; Neural networks; Prototypes; Radial basis function networks; Sensitivity analysis; Surface fitting; Training data; Vector quantization;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Conference on Neural Networks, 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type
    conf
  • DOI
    10.1109/ICNN.1997.614174
  • Filename
    614174