Title :
Convergence of learning algorithms with constant learning rates
Author :
Kuan, Chung-Ming ; Hornik, Kurt
Author_Institution :
Dept. of Econ., Univ. of Illinois, Urbana, IL, USA
Date :
September 1991
Abstract :
The behavior of neural network learning algorithms with a small, constant learning rate ε in stationary, random input environments is investigated. It is rigorously established that, in the sense of weak convergence of random processes as ε tends to zero, the sequence of weight estimates can be approximated by the solution of an associated ordinary differential equation. As applications, backpropagation in feedforward architectures and some feature extraction algorithms are studied in more detail.
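For illustration, the sketch below (not from the paper) simulates the constant-step LMS rule, a simple special case of the class of algorithms covered: with stationary Gaussian inputs, the weight path interpolated at time t = kε closely tracks the solution of the mean ODE dw/dt = r − Rw. The Gaussian input model, all variable names, and the numpy/scipy usage here are our own assumptions, not the authors'.

```python
# Minimal sketch: constant-step LMS vs. its mean ODE (illustrative only).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Stationary random inputs: x ~ N(0, R); targets d = x @ w_true + noise.
w_true = np.array([1.0, -0.5])
R = np.array([[1.0, 0.3], [0.3, 0.5]])   # input covariance E[x x^T]
L = np.linalg.cholesky(R)                # for sampling x with cov R

eps = 0.01                               # small constant learning rate
n_steps = 2000
w = np.zeros(2)                          # stochastic weight iterate
traj = [w.copy()]
for _ in range(n_steps):
    x = L @ rng.standard_normal(2)
    d = x @ w_true + 0.1 * rng.standard_normal()
    w = w + eps * (d - x @ w) * x        # LMS update with constant step eps
    traj.append(w.copy())
traj = np.array(traj)

# Mean ODE: dw/dt = r - R w with r = E[d x] = R w_true, i.e.
# w(t) = w_true + expm(-R t) (w(0) - w_true), so w(t) -> w_true.
t = eps * np.arange(n_steps + 1)         # interpolated time scale t = k * eps
ode = np.array([w_true + expm(-R * tk) @ (traj[0] - w_true) for tk in t])

# As eps -> 0, the interpolated weight path converges weakly to the ODE path.
print("max deviation from ODE path:", np.abs(traj - ode).max())
```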
Keywords :
convergence of numerical methods; differential equations; learning systems; neural nets; pattern recognition; random processes; backpropagation; convergence; differential equation; feature extraction; feedforward architectures; learning algorithms; neural network; weight estimates; Environmental economics; Information analysis; Interpolation; Neural networks; Pattern analysis; Tail
Journal_Title :
IEEE Transactions on Neural Networks