DocumentCode :
2863630
Title :
Learning algorithms for reformulated radial basis neural networks
Author :
Karayiannis, Nicolaos B.
Author_Institution :
Dept. of Electr. & Comput. Eng., Houston Univ., TX, USA
Volume :
3
fYear :
1998
fDate :
4-9 May 1998
Firstpage :
2230
Abstract :
This paper proposes supervised learning algorithms based on gradient descent for training reformulated radial basis function (RBF) neural networks. Such RBF models employ radial basis functions whose form is determined by admissible generator functions. RBF networks with Gaussian radial basis functions are generated by exponential generator functions. A sensitivity analysis provides the basis for selecting generator functions by investigating the effect of linear, exponential, and logarithmic generator functions on gradient descent learning. Experiments involving reformulated RBF networks indicate that the proposed gradient descent algorithms guarantee fast learning and very satisfactory function approximation capability.
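Illustrative Code :
The abstract describes RBF networks whose Gaussian basis functions correspond to an exponential generator function and whose parameters are trained by gradient descent. The sketch below is a minimal illustration of that general setup, not the paper's exact reformulation: a Gaussian RBF network whose output weights, prototype (center) vectors, and a shared spread parameter are all updated by plain gradient descent on squared error. All names (rbf_forward, train_rbf, beta, etc.) are hypothetical.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's exact reformulation): a Gaussian
# RBF network, i.e. the exponential-generator case, with output weights w,
# prototype vectors C, and a shared spread parameter beta, all trained by
# plain gradient descent on squared error. All names here are hypothetical.

rng = np.random.default_rng(0)

def rbf_forward(X, C, beta, w):
    # Squared distances between inputs and prototypes, then Gaussian responses
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)   # (n, m)
    Phi = np.exp(-beta * d2)                                   # (n, m)
    return Phi @ w, Phi, d2

def train_rbf(X, y, n_centers=10, lr=0.05, epochs=500):
    n, _ = X.shape
    C = X[rng.choice(n, n_centers, replace=False)].copy()     # init centers from data
    beta = 1.0
    w = rng.normal(scale=0.1, size=n_centers)
    for _ in range(epochs):
        yhat, Phi, d2 = rbf_forward(X, C, beta, w)
        err = yhat - y                                         # residuals
        # Gradients of 0.5 * mean squared error w.r.t. w, beta, and C
        grad_w = Phi.T @ err / n
        grad_beta = -np.mean((err[:, None] * w[None, :] * Phi * d2).sum(axis=1))
        grad_C = (err[:, None, None] * w[None, :, None] * Phi[:, :, None]
                  * 2.0 * beta * (X[:, None, :] - C[None, :, :])).mean(axis=0)
        w -= lr * grad_w
        beta -= lr * grad_beta
        C -= lr * grad_C
    return C, beta, w

# Usage: approximate sin(x) on [-3, 3]
X = np.linspace(-3.0, 3.0, 200)[:, None]
y = np.sin(X[:, 0])
C, beta, w = train_rbf(X, y)
```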
Keywords :
conjugate gradient methods; feedforward neural nets; learning (artificial intelligence); sensitivity analysis; Gaussian radial basis functions; RBF neural networks; admissible generator functions; exponential generator functions; fast learning; function approximation capability; gradient descent learning; linear generator functions; logarithmic generator functions; reformulated radial basis neural networks; sensitivity analysis; supervised learning algorithms; Approximation algorithms; Clustering algorithms; Computer networks; Function approximation; Neural networks; Prototypes; Radial basis function networks; Shape; Supervised learning; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 1998 IEEE International Joint Conference on Neural Networks Proceedings, IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
ISSN :
1098-7576
Print_ISBN :
0-7803-4859-1
Type :
conf
DOI :
10.1109/IJCNN.1998.687207
Filename :
687207