DocumentCode :
1132126
Title :
Convergence of learning algorithms with constant learning rates
Author :
Kuan, Chung-Ming; Hornik, Kurt
Author_Institution :
Dept. of Econ., Illinois Univ., Urbana, IL, USA
Volume :
2
Issue :
5
fYear :
1991
fDate :
9/1/1991
Firstpage :
484
Lastpage :
489
Abstract :
The behavior of neural network learning algorithms with a small, constant learning rate, ε, in stationary, random input environments is investigated. It is rigorously established that the sequence of weight estimates can be approximated by a certain ordinary differential equation, in the sense of weak convergence of random processes as ε tends to zero. As applications, backpropagation in feedforward architectures and some feature extraction algorithms are studied in more detail.
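The abstract's central claim, that constant-step-size weight updates track an averaged ordinary differential equation as ε tends to zero, can be illustrated numerically. The Python sketch below is illustrative only and is not taken from the paper: it compares a stochastic trajectory with constant learning rate eps against the Euler-integrated mean flow dw/dt = E[H(w, x)] for a one-dimensional quadratic loss with stationary Gaussian inputs; the loss, the constants, and all names are assumptions.

```python
# Minimal sketch (not the paper's formal construction): for a constant
# learning rate eps, the stochastic updates
#     w_{k+1} = w_k + eps * H(w_k, x_k)
# stay close, as eps -> 0, to the solution of the averaged ODE
#     dw/dt = E[H(w, x)].
# Here H is the negative gradient of a 1-D quadratic loss with stationary
# random inputs; w_star and all constants are chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)

eps = 0.01       # small constant learning rate
steps = 2000
w_star = 1.5     # target weight (assumed)

def H(w, x):
    """Stochastic update direction: negative gradient of 0.5*(x*w - x*w_star)**2."""
    return -x * (x * w - x * w_star)

# Stochastic trajectory with constant step size eps.
w = 0.0
sgd_path = []
for _ in range(steps):
    x = rng.normal()          # stationary random input environment
    w = w + eps * H(w, x)
    sgd_path.append(w)

# Averaged ODE: dw/dt = E[H(w, x)] = -(w - w_star) * E[x^2], with E[x^2] = 1.
# On the time scale t = eps * k, one Euler step of size eps tracks the flow.
v = 0.0
ode_path = []
for _ in range(steps):
    v = v + eps * (-(v - w_star))   # mean-field drift
    ode_path.append(v)

print("final SGD weight:", sgd_path[-1])
print("final ODE weight:", ode_path[-1])
print("max deviation   :", max(abs(a - b) for a, b in zip(sgd_path, ode_path)))
```

Shrinking eps (and raising steps proportionally, so the elapsed ODE time eps * steps is unchanged) should visibly tighten the gap between the two trajectories, which is the qualitative content of the weak-convergence result.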
Keywords :
convergence of numerical methods; differential equations; learning systems; neural nets; pattern recognition; random processes; backpropagation; convergence; differential equation; feature extraction; feedforward architectures; learning algorithms; neural network; weight estimates; Convergence; Differential equations; Environmental economics; Feature extraction; Information analysis; Interpolation; Neural networks; Pattern analysis; Random processes; Tail
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.134285
Filename :
134285