DocumentCode :
2259601
Title :
Convex geometry and nonlinear approximation
Author :
Kainen, Paul C.
Author_Institution :
Dept. of Math., Georgetown Univ., Washington, DC, USA
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
299
Abstract :
A variety of properties of neural approximation follow from considerations of convexity. For example, if n and d are positive integers, X=Lp([0,1]d) (with 1<p<∞), and ε is any given positive constant, no matter how large, then there is no continuous function φ that associates to each element of X an input-output function of a one-hidden-layer neural network with n hidden units and one linear output unit such that the error ||f-φ(f)|| is within ε of the minimum possible error for every f in X. It is also shown that the additional multiplicative factor introduced into Barron's bound (1992, 1993) by Kurkova, Savicky, and Hlavackova (1998) has an expected value of one half.
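The input-output functions referred to in the abstract are sums of n scaled ridge units with a single linear output. A minimal sketch of such a function, using tanh hidden units and purely illustrative (hypothetical) parameters:

```python
import math

def one_hidden_layer(x, units):
    """Input-output function of a one-hidden-layer network with a
    linear output unit: sum of c * tanh(a*x + b) over hidden units.
    `units` is a list of (a, b, c) triples, one per hidden unit."""
    return sum(c * math.tanh(a * x + b) for (a, b, c) in units)

# n = 3 hidden units; the parameter values below are arbitrary examples.
params = [(1.0, 0.0, 0.5), (-2.0, 1.0, 0.3), (0.5, -0.5, -0.2)]
y = one_hidden_layer(0.25, params)
```

For fixed n, the set of all such functions is not convex (a sum of two n-unit networks generally needs 2n units), which is the convexity consideration behind the nonexistence of a continuous near-best approximation map.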
Keywords :
approximation theory; geometry; multilayer perceptrons; I/O function; convex geometry; input-output function; multiplicative factor; neural approximation; nonlinear approximation; one-hidden-layer neural network; Chebyshev approximation; Fourier series; Geometry; Hilbert space; Linear approximation; Mathematics; Neural networks; Polynomials; Shape; Topology;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.857852
Filename :
857852