Title of article :
Deterministic convergence of an online gradient method for neural networks
Author/Authors :
Wu, Wei and Xu, Yuesheng
Issue Information :
Journal issue, serial year 2002
Pages :
13
From page :
335
To page :
347
Abstract :
The online gradient method has been widely used as a learning algorithm for neural networks. We establish deterministic convergence of online gradient methods for training a class of nonlinear feedforward neural networks when the training examples are linearly independent. The learning rate η is chosen to be a constant throughout the training procedure. The monotonicity of the error function over the iterations is proved, and a criterion for choosing the learning rate η is provided to guarantee convergence. Under conditions similar to those for the classical gradient methods, an optimal convergence rate for our online gradient methods is proved.
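The abstract's setting can be illustrated with a minimal sketch: online (per-example) gradient descent with a constant learning rate η on a one-hidden-layer feedforward network, trained on linearly independent examples. All names, the network size, and the parameter values below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def online_gradient_train(X, y, hidden=4, eta=0.1, epochs=200, seed=0):
    """Online gradient method with constant learning rate eta.

    Cycles through the training examples one at a time, updating the
    weights after each example (in contrast to batch gradient descent).
    Illustrative sketch only; hyperparameters are assumptions.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.5, size=(hidden, d))  # input-to-hidden weights
    v = rng.normal(scale=0.5, size=hidden)       # hidden-to-output weights
    for _ in range(epochs):
        for i in range(n):                        # online: one example per update
            h = sigmoid(W @ X[i])
            err = v @ h - y[i]                    # residual of 0.5 * err**2 loss
            grad_v = err * h
            grad_W = np.outer(err * v * h * (1.0 - h), X[i])
            v -= eta * grad_v                     # constant learning rate eta
            W -= eta * grad_W
    return W, v

def mse(W, v, X, y):
    preds = np.array([v @ sigmoid(W @ x) for x in X])
    return float(np.mean((preds - y) ** 2))
```

A usage example matching the linear-independence assumption: the rows of an identity matrix are linearly independent training inputs, and the per-example error typically decreases monotonically for a suitably small constant η.

```python
X = np.eye(3)                      # three linearly independent examples
y = np.array([0.2, -0.1, 0.4])
W0, v0 = online_gradient_train(X, y, epochs=0)    # initial weights
W, v = online_gradient_train(X, y, epochs=200)
assert mse(W, v, X, y) < mse(W0, v0, X, y)        # training reduces the error
```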
Keywords :
Constant learning rate , Nonlinear feedforward neural networks , Monotonicity , Deterministic convergence , Online stochastic gradient method
Journal title :
Journal of Computational and Applied Mathematics
Serial Year :
2002
Record number :
1551813