DocumentCode :
1944344
Title :
A brief discussion on moderatism based local gradient learning rules
Author :
Islam, Mohammad Tanvir ; Okabe, Yoichi
Author_Institution :
Dept. of Electron. Eng., Tokyo Univ., Japan
Volume :
2
fYear :
2003
fDate :
1-4 July 2003
Firstpage :
239
Abstract :
Moderatism [Y. Okabe et al., 1988], a learning rule for artificial neural networks (ANNs), is based on the principle that individual neurons, and neural nets as a whole, try to sustain a "moderate" level in their input and output signals; in this way, a close mutual relationship with the outside environment is maintained. In this paper, two potential moderatism-based local gradient learning rules are proposed. A pattern learning experiment is then performed to compare the learning performance of these two rules with that of the error-based weight update (EBWU) rule [Tanvir Islam, M. et al., December 2001][Tanvir Islam, M. et al., September 2001] and error backpropagation [Bishop, C. M. et al., 1995].
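The abstract states the principle but not the rule equations. As a purely illustrative sketch (not the authors' formulation), a local gradient update derived from a cost that penalizes deviation of a neuron's output from a hypothetical "moderate" activity target m could look like:

```python
import numpy as np

def moderatism_local_update(w, x, m=0.5, lr=1.0):
    """Illustrative only: one gradient step on the local cost
    C = 0.5 * (y - m)**2, where y = sigmoid(w . x) is the neuron's
    output and m is a hypothetical 'moderate' activity target.
    Uses only signals local to the neuron (its input x and output y)."""
    y = 1.0 / (1.0 + np.exp(-np.dot(w, x)))     # sigmoid activation
    # dC/dw = (y - m) * y * (1 - y) * x  (chain rule, local terms only)
    grad = (y - m) * y * (1.0 - y) * x
    return w - lr * grad

# Repeated updates drive the neuron's output toward the target level m.
w = np.zeros(3)
x = np.array([1.0, -0.5, 0.25])
for _ in range(500):
    w = moderatism_local_update(w, x, m=0.8)
```

This is a sketch of the general idea of a moderatism-style local cost; the two rules proposed in the paper, and the EBWU rule, differ in their exact cost functions and update equations.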
Keywords :
backpropagation; neural nets; signal processing; ANN learning rule; artificial neural networks; error backpropagation rule; error-based weight update rule; local gradient learning rules; neurons; pattern learning experiment; moderatism-based local gradient learning rules; Artificial neural networks; Backpropagation algorithms; Biological system modeling; Cost function; Equations; Error correction; Multi-layer neural network; Neural networks; Neurons
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the Seventh International Symposium on Signal Processing and Its Applications, 2003
Print_ISBN :
0-7803-7946-2
Type :
conf
DOI :
10.1109/ISSPA.2003.1224858
Filename :
1224858