DocumentCode :
2225298
Title :
General approximation theorem on feedforward networks
Author :
Huang, Guang-Bin; Babri, Haroon A.
Author_Institution :
Sch. of Electr. & Electron. Eng., Nanyang Technol. Inst., Singapore
fYear :
1997
fDate :
9-12 Sep 1997
Firstpage :
698
Abstract :
We show that standard feedforward neural networks with as few as a single hidden layer and an arbitrary bounded nonlinear (continuous or noncontinuous) activation function that has two unequal limits at infinities can uniformly approximate (as opposed to approximating in measure) arbitrary bounded continuous mappings on R^n to any precision. In particular, on a compact subset of R^n, standard feedforward neural networks with as few as a single hidden layer and an arbitrary bounded nonlinear (continuous or noncontinuous) activation function can uniformly approximate arbitrary continuous mappings to any precision. These results also hold for standard feedforward neural networks with multiple hidden layers. We find that boundedness and unequal limits at infinities are sufficient, but not necessary, conditions on the activation function.
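The theorem stated in the abstract can be illustrated numerically. The sketch below (an assumption for illustration, not the authors' construction) builds a single-hidden-layer network whose activation is the Heaviside step — bounded, discontinuous, and with unequal limits at plus and minus infinity — and fits only the output weights by least squares to approximate a continuous mapping on the compact set [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous mapping on a compact subset of R
def f(x):
    return np.sin(2 * np.pi * x)

x = np.linspace(0.0, 1.0, 200)[:, None]  # compact set [0, 1]

# Bounded, discontinuous activation with unequal limits at +/- infinity
def step(z):
    return (z > 0).astype(float)         # Heaviside step

# Single hidden layer with randomly placed units; only the output
# weights are fit, so the hidden layer provides a fixed basis.
n_hidden = 100
W = rng.normal(scale=10.0, size=(1, n_hidden))   # input weights
b = rng.uniform(-10.0, 10.0, size=n_hidden)      # hidden biases
H = step(x @ W + b)                              # hidden-layer outputs
beta, *_ = np.linalg.lstsq(H, f(x), rcond=None)  # output weights

# Uniform (sup-norm) approximation error on the compact set
err = np.max(np.abs(H @ beta - f(x)))
print(f"sup-norm error: {err:.4f}")
```

The error shrinks as `n_hidden` grows, consistent with the uniform-approximation claim; the specific scales (10.0 for weights and biases) are arbitrary choices that place enough step discontinuities inside [0, 1].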
Keywords :
approximation theory; feedforward neural nets; transfer functions; arbitrary bounded continuous mappings; bounded nonlinear activation functions; continuous activation functions; feedforward neural networks; general approximation theorem; multiple hidden layers; noncontinuous activation functions; sufficient conditions; unequal limits; uniform approximation; Concrete; Convergence; Electric variables measurement; Extraterrestrial measurements; Feedforward neural networks; H infinity control; Measurement standards; Multi-layer neural network; Neural networks; Sufficient conditions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1997 International Conference on Information, Communications and Signal Processing (ICICS 1997)
Print_ISBN :
0-7803-3676-3
Type :
conf
DOI :
10.1109/ICICS.1997.652067
Filename :
652067