Title of article :
Approximation with neural networks activated by ramp sigmoids
Author/Authors :
Gerald H.L. Cheang
Issue Information :
Journal issue, serial year 2010
Pages :
16
From page :
1450
To page :
1465
Abstract :
Accurate and parsimonious approximations for indicator functions of $d$-dimensional balls and related functions are given using level sets associated with the thresholding of a linear combination of ramp sigmoid activation functions. In neural network terminology, we are using a single-hidden-layer perceptron network implementing the ramp sigmoid activation function to approximate the indicator of a ball. In order to have a relative accuracy $\epsilon$, we use $T = c\,d^2/\epsilon^2$ ramp sigmoids, a result comparable to that of Cheang and Barron (2000) [4], where unit step activation functions are used instead. The result is then applied to functions that have variation $V_f$ with respect to a class of ellipsoids. Two-hidden-layer feedforward neural nets with ramp sigmoid activation functions are used to approximate such functions. The approximation error is shown to be bounded by a constant times $V_f/T_1^{1/2} + V_f d/T_2^{1/4}$, where $T_1$ is the number of nodes in the outer layer and $T_2$ is the number of nodes in the inner layer of the approximation $f_{T_1,T_2}$.
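The general idea described in the abstract — approximate each squared coordinate by a linear combination of ramp sigmoids, sum over coordinates, and take a level set (threshold) to recover the indicator of a ball — can be illustrated with a minimal NumPy sketch. The helper names (ramp, square_via_ramps, ball_indicator_approx), the knot placement, and the knot count below are illustrative assumptions, not the paper's precise construction or its stated constants.

import numpy as np

def ramp(z):
    # Ramp sigmoid: 0 for z < 0, z on [0, 1], 1 for z > 1.
    return np.clip(z, 0.0, 1.0)

def square_via_ramps(t, knots):
    # Piecewise-linear interpolant of t**2 at the given knots, written as a
    # constant plus a linear combination of ramp sigmoid units.
    g = knots**2                          # target values at the knots
    out = np.full_like(t, g[0], dtype=float)
    for k in range(len(knots) - 1):
        h = knots[k + 1] - knots[k]
        out += (g[k + 1] - g[k]) * ramp((t - knots[k]) / h)
    return out

def ball_indicator_approx(x, radius=1.0, knots_per_dim=32):
    # Approximate 1{||x||^2 <= radius^2} by thresholding (taking a level set of)
    # a sum of ramp-sigmoid units, one group of units per coordinate.
    x = np.atleast_2d(x)                  # shape (n_points, d)
    knots = np.linspace(-radius, radius, knots_per_dim)
    q = sum(square_via_ramps(x[:, i], knots) for i in range(x.shape[1]))
    return (q <= radius**2).astype(float)

# Quick check on random points in dimension d = 3 (values are illustrative).
rng = np.random.default_rng(0)
pts = rng.uniform(-1.2, 1.2, size=(5, 3))
print(ball_indicator_approx(pts))
print((np.sum(pts**2, axis=1) <= 1.0).astype(float))

In network terms, the ramp units form the single hidden layer and the final threshold plays the role of the level set, matching the single-hidden-layer description in the abstract; increasing knots_per_dim tightens the approximation near the boundary of the ball.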
Keywords :
Two-hidden-layer neural net approximation , Approximation error bounds , Approximation of ellipsoids , Ramp sigmoids
Journal title :
Journal of Approximation Theory
Serial Year :
2010
Record number :
852811