DocumentCode
2895243
Title
Characterization of Degree of Approximation for Neural Networks with One Hidden Layer
Author
Cao, Fei-Long; Xu, Zong-Ben; He, Man-xi
Author_Institution
Dept. of Inf. & Math. Sci., China Jiliang Univ., Zhejiang
fYear
2006
fDate
13-16 Aug. 2006
Firstpage
2944
Lastpage
2947
Abstract
There have been various studies on the approximation ability of feedforward neural networks (FNNs). Most existing studies, however, are concerned only with density results or upper bound estimates of how well a function can be approximated by an FNN, and consequently the essential approximation ability of an FNN cannot be revealed. In this paper, by establishing both upper and lower bound estimates on the degree of approximation, the essential approximation ability of a class of FNNs is characterized in terms of the modulus of smoothness of the functions to be approximated. The FNNs involved can not only approximate any continuous function arbitrarily well, but also provide an explicit lower bound on the number of hidden units required. By making use of approximation-theoretic tools, it is shown that when the functions to be approximated are Lipschitzian, the approximation speed of the FNNs is determined by the modulus of smoothness of the functions.
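For orientation only (this sketch is not taken from the paper; the exact norms, exponents, and constants in its theorems may differ): degree-of-approximation results of this kind are typically Jackson-type estimates that bound the best approximation error achievable by one-hidden-layer networks with n hidden units via the modulus of smoothness of the target function, for example:

% Illustrative, assumed generic form -- not the paper's exact statement.
% \mathcal{N}_n = class of one-hidden-layer FNNs with n hidden units,
% \omega(f, t)  = modulus of smoothness (continuity) of f on a compact set K \subset \mathbb{R}^d.
\[
  \inf_{g \in \mathcal{N}_n} \| f - g \|_{C(K)} \;\le\; C \, \omega\!\left(f, n^{-1/d}\right)
\]
% For a Lipschitz function, \omega(f, t) \le L t, so the rate above becomes O(n^{-1/d});
% a lower bound of matching order is what makes such a characterization "essential".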
Keywords
approximation theory; feedforward neural nets; approximation ability; bound estimation; feedforward neural networks; artificial neural networks; neural networks; approximation error; approximation methods; approximation order; upper bound; machine learning; cybernetics; mathematics
fLanguage
English
Publisher
ieee
Conference_Titel
Machine Learning and Cybernetics, 2006 International Conference on
Conference_Location
Dalian, China
Print_ISBN
1-4244-0061-9
Type
conf
DOI
10.1109/ICMLC.2006.259143
Filename
4028566