DocumentCode
303244
Title
Fuzzy logic adapted nodal training parameter
Author
Gelder, Michael S.
Author_Institution
Sch. of Manuf. & Mech. Eng., Birmingham Univ., UK
Volume
1
fYear
1996
fDate
3-6 Jun 1996
Firstpage
387
Abstract
A technique is outlined for improving the learning rate of a multilayer perceptron (MLP) network. Each network node is assigned its own training rate parameter, which is adapted using fuzzy logic as part of the error backpropagation process. This involves the development of target values for hidden-layer node outputs. These values are based on the current network weight state and are therefore different for each epoch. Using two test vector distributions, it is demonstrated that this approach can reduce MLP convergence time; the approach is compared with three other training methods: standard backpropagation, a fuzzy-adapted global training rate parameter, and the delta-bar-delta learning rule.
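The sketch below illustrates the general idea described in the abstract: an MLP trained by error backpropagation in which every node carries its own training rate parameter that is adjusted each epoch. The fuzzy rule base and the hidden-layer target construction used in the paper are not reproduced; the toy adjustment rule here (based on the sign-consistency of each node's error signal) and the XOR data are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch: backpropagation with one learning-rate parameter per node,
# nudged up or down each epoch by a toy fuzzy-style rule.  This is NOT the
# paper's rule base or its hidden-layer target scheme.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR-style toy data standing in for the paper's test vector distributions.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid));  b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

# One training-rate parameter per node (hidden and output).
eta_hid = np.full(n_hid, 0.5)
eta_out = np.full(n_out, 0.5)
prev_delta_hid = np.zeros(n_hid)
prev_delta_out = np.zeros(n_out)

def fuzzy_adjust(eta, delta, prev_delta, lo=0.01, hi=2.0):
    # Toy "fuzzy" adjustment (an assumption): sign agreement between
    # successive error signals acts as a degree of membership in
    # "consistent descent" and scales eta up; disagreement scales it down.
    agreement = np.tanh(delta * prev_delta * 10.0)   # in (-1, 1)
    return np.clip(eta * (1.0 + 0.1 * agreement), lo, hi)

for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Backpropagated error signals, averaged over the batch per node.
    err_out = (y - Y) * y * (1 - y)              # output-layer deltas
    err_hid = (err_out @ W2.T) * h * (1 - h)     # hidden-layer deltas
    delta_out = err_out.mean(axis=0)
    delta_hid = err_hid.mean(axis=0)

    # Per-node learning-rate adaptation.
    eta_out = fuzzy_adjust(eta_out, delta_out, prev_delta_out)
    eta_hid = fuzzy_adjust(eta_hid, delta_hid, prev_delta_hid)
    prev_delta_out, prev_delta_hid = delta_out, delta_hid

    # Weight updates, scaled column-wise by each receiving node's own rate.
    W2 -= (h.T @ err_out) * eta_out
    b2 -= err_out.sum(axis=0) * eta_out
    W1 -= (X.T @ err_hid) * eta_hid
    b1 -= err_hid.sum(axis=0) * eta_hid

print("final mean squared error:", float(np.mean((y - Y) ** 2)))
```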
Keywords
backpropagation; convergence; fuzzy logic; fuzzy neural nets; multilayer perceptrons; neural net architecture; performance index; convergence time; delta-bar-delta learning rule; error backpropagation; hidden layer node output; multilayer perceptron; nodal training parameter; weight state; Computer networks; Convergence; Fuzzy logic; Jacobian matrices; Learning systems; Manufacturing; Mechanical engineering; Multilayer perceptrons; Performance analysis; Testing
fLanguage
English
Publisher
ieee
Conference_Titel
IEEE International Conference on Neural Networks, 1996
Conference_Location
Washington, DC
Print_ISBN
0-7803-3210-5
Type
conf
DOI
10.1109/ICNN.1996.548923
Filename
548923
Link To Document