Title of article :
Moving least-square method in learning theory
Author/Authors :
Hongyan Wang, Dao-Hong Xiang, Ding-Xuan Zhou
Issue Information :
Journal with serial number, year 2010
Pages :
16
From page :
599
To page :
614
Abstract :
The moving least-square (MLS) method is an approximation method used in data interpolation, numerical analysis, and statistics. In this paper we consider the MLS method in learning theory for the regression problem. Essential differences between MLS and other common learning algorithms are pointed out: the lack of a natural uniform bound for the estimators and the pointwise definition. The sample error is estimated in terms of the weight function and the finite-dimensional hypothesis space. The approximation error is analyzed for two special cases, for which convergence rates for the total L2 error, measuring the global approximation on the whole domain, are provided.
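For illustration only, the sketch below shows the pointwise character of an MLS regression estimator described in the abstract: at each query point, a local weighted least-squares fit over a finite-dimensional hypothesis space is computed and evaluated at that point. The Gaussian weight function, the bandwidth, and the linear polynomial basis are illustrative assumptions, not the specific setting analyzed in the paper.

```python
import numpy as np

def mls_estimate(x_query, X, Y, bandwidth=0.1):
    """Pointwise MLS estimator with a linear basis {1, x - x_query} and a
    Gaussian weight (both are illustrative choices, not the paper's setup)."""
    # Weight function: samples near the query point get larger weights.
    w = np.exp(-((X - x_query) ** 2) / (2 * bandwidth ** 2))
    # Design matrix for the finite-dimensional hypothesis space span{1, x - x_query}.
    B = np.stack([np.ones_like(X), X - x_query], axis=1)
    # Solve the local weighted least-squares problem:
    #   min_c  sum_i w_i * (Y_i - (B c)_i)^2
    W = np.diag(w)
    coef, *_ = np.linalg.lstsq(B.T @ W @ B, B.T @ W @ Y, rcond=None)
    # With the basis centered at x_query, the estimator there is the constant term.
    return coef[0]

# Toy usage: noisy samples of a smooth regression function on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 200)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=200)
grid = np.linspace(0.0, 1.0, 5)
print([round(mls_estimate(x, X, Y), 3) for x in grid])
```

Because the estimator is defined separately at each point, it carries no natural uniform bound over the domain, which is one of the differences from common learning algorithms highlighted in the abstract.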
Keywords :
Norming condition , Approximation error , Learning theory , Moving least-square method , Sample error
Journal title :
Journal of Approximation Theory
Serial Year :
2010
Record number :
852762