DocumentCode :
2652169
Title :
Networks of exponential neurons for multivariate function approximation
Author :
Geva, Shlomo ; Sitte, Joaquin
Author_Institution :
Fac. of Inf. Technol., Queensland Univ. of Technol., Brisbane, Qld., Australia
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
2305
Abstract :
A three-layer neural network with a hidden layer of neurons having an exponential transfer function is described; it performs function approximation more accurately, and more economically, than a conventional multilayer perceptron (MLP) whose neurons have a sigmoidal transfer function. The network was trained by a variation of the standard backpropagation gradient-descent technique. Results for a difficult approximation problem, on which a conventional MLP of similar size fails to perform within reasonable constraints on training time, are shown graphically.
Keywords :
function approximation; learning systems; neural nets; transfer functions; backpropagation gradient-descent technique; exponential neurons; exponential transfer function; hidden layer; multivariate function approximation; three-layer neural network; Feeds; Function approximation; Gaussian processes; Neural networks; Neurons; Pattern classification; Piecewise linear approximation; Surface fitting; Transfer functions; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170732
Filename :
170732