Title of article
Ditzian-Totik modulus of smoothness for the fractional derivative of functions in Lp space of the partial neural network
Author/Authors
Hassan Ibrahim, Amenah (Department of Mathematics, College of Sciences, Al-Mustansiriyah University, Baghdad, Iraq); Samir Bhaya, Eman (Department of Mathematics, College of Education for Pure Sciences, University of Babylon, Iraq); Ali Hessen, Eman (Department of Mathematics, College of Sciences, Al-Mustansiriyah University, Baghdad, Iraq)
Pages
13
From page
3305
To page
3317
Abstract
Several authors have studied weighted approximation by the partial neural network. In this paper, we study the weighted Ditzian-Totik modulus of smoothness for the fractional derivative of functions in Lp for the partial neural network, together with the approximation of real-valued functions over a compact interval by tangent-sigmoid quasi-interpolation operators. These approximations involve the left and right Caputo fractional derivatives of the function under consideration, and are realized by feed-forward neural networks with a single hidden layer. Our higher-order fractional approximation yields better convergence than ordinary approximation, and some applications are given. All results are proved in Lp[X] spaces, where 0
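For orientation, the following is a minimal sketch of the standard unweighted definitions behind the terms used in the abstract; the weighted variants, the exact operator F_n, and the density Phi written below are assumptions for illustration and may differ from the authors' constructions.
\[
\omega^{r}_{\varphi}(f,t)_{p}
   = \sup_{0<h\le t}\bigl\|\Delta^{r}_{h\varphi}f\bigr\|_{L_p[0,1]},
\qquad \varphi(x)=\sqrt{x(1-x)},
\]
where the symmetric difference
\[
\Delta^{r}_{h\varphi(x)}f(x)=\sum_{k=0}^{r}(-1)^{k}\binom{r}{k}
   f\!\Bigl(x+\Bigl(\tfrac{r}{2}-k\Bigr)h\varphi(x)\Bigr)
\]
is taken to be zero when any argument leaves the interval. The left and right Caputo fractional derivatives of order \(\alpha>0\), with \(n=\lceil\alpha\rceil\), are
\[
D^{\alpha}_{*a}f(x)=\frac{1}{\Gamma(n-\alpha)}\int_{a}^{x}(x-t)^{\,n-\alpha-1}f^{(n)}(t)\,dt,
\qquad
D^{\alpha}_{b-}f(x)=\frac{(-1)^{n}}{\Gamma(n-\alpha)}\int_{x}^{b}(t-x)^{\,n-\alpha-1}f^{(n)}(t)\,dt,
\]
and the hyperbolic tangent sigmoid activation is \(\tanh x=(e^{x}-e^{-x})/(e^{x}+e^{-x})\). A quasi-interpolation operator of the kind described, realized by a feed-forward network with one hidden layer on an interval \([a,b]\), typically has the form
\[
F_{n}(f,x)=\frac{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor} f\!\bigl(\tfrac{k}{n}\bigr)\,\Phi(nx-k)}
                {\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor} \Phi(nx-k)},
\]
with \(\Phi\) a bell-shaped density built from the sigmoid.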
Keywords
approximation, Ditzian-Totik modulus, higher-order fractional approximation, Caputo fractional derivative, partial neural network
Journal title
International Journal of Nonlinear Analysis and Applications
Serial Year
2022
Record number
2714122
Link To Document