DocumentCode :
2694240
Title :
Relative effectiveness of training set patterns for backpropagation
Author :
Cheung, Raymond K M ; Lustig, Irving ; Kornhauser, A.L.
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
673
Abstract :
Backpropagation (BP) is a gradient-descent method for searching for optimal parameter settings in a neural network model. The learning process can be split into three stages according to the behavior of the errors produced. Analysis of the different behaviors in these stages shows the existence of poorly trained patterns that have a strong influence on the performance of the BP model. The benefit of considering the relative effectiveness of training patterns is investigated using two modified BP training procedures, both of which exploit information about the relative importance of training patterns. The results show that learning speed and generalization ability are improved.
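The abstract describes weighting training patterns by their relative importance during BP. The following is a minimal sketch of that idea, not the authors' exact procedure: a one-hidden-layer network trained with gradient descent in which each pattern's gradient contribution is scaled by its relative error, so poorly trained patterns receive more emphasis. The network size, sigmoid activation, XOR data, and the error-proportional weighting rule are all illustrative assumptions.

```python
# Sketch: pattern-weighted backpropagation (assumed weighting scheme, not the
# paper's exact procedure). Patterns with larger current error get larger
# gradient weight on the next update.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: XOR patterns (illustrative).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units (illustrative choice).
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

lr = 0.5
for epoch in range(5000):
    # Forward pass over all patterns.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # Per-pattern squared error and relative-importance weights:
    # a pattern's weight is its share of the total error.
    err = np.sum((Y - T) ** 2, axis=1)      # shape (n_patterns,)
    w = err / (err.sum() + 1e-12)           # relative effectiveness weights

    # Backward pass with pattern-weighted gradients.
    dY = (Y - T) * Y * (1 - Y) * w[:, None]
    dH = (dY @ W2.T) * H * (1 - H)

    W2 -= lr * H.T @ dY
    b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print("final outputs:", Y.ravel().round(3))
```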
Keywords :
learning systems; neural nets; backpropagation; generalization ability; gradient-descent method; learning process; learning speed; optimal parameter settings; poorly trained patterns; supervised learning; training set effectiveness; training set patterns;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137646
Filename :
5726606