DocumentCode :
1645421
Title :
Removing decision surface skew using complementary inputs
Author :
Andersen, Timothy L.
Author_Institution :
Comput. Sci. Dept., Boise State Univ., ID, USA
Volume :
1
fYear :
2002
Firstpage :
263
Lastpage :
267
Abstract :
We examine the tendency of backpropagation-based training algorithms to favor examples that have large input feature values, in terms of the ability of such examples to influence the weights of the network, and show that this tendency can lead to sub-optimal decision surfaces. We propose a method for counteracting this tendency that modifies the original input feature vector through the addition of complementary inputs.
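The abstract describes augmenting the input feature vector with complementary inputs so that examples with small feature values can drive weight updates as strongly as examples with large values. Below is a minimal sketch of one way such an augmentation could look, assuming features are scaled to [0, 1] and each feature x is paired with its complement 1 - x; the function name and the scaling assumption are illustrative and not taken from the paper.

```python
import numpy as np

def add_complementary_inputs(X):
    """Append the complement (1 - x) of every feature to the input vector.

    With plain backpropagation, an input feature equal to 0 contributes
    nothing to the gradient of its incoming weight, so examples with small
    feature values have little influence on those weights. Pairing each
    feature with its complement lets small values also produce large
    activations on the added inputs.
    """
    X = np.asarray(X, dtype=float)
    return np.concatenate([X, 1.0 - X], axis=-1)

# Example: a 2-feature batch becomes a 4-feature batch.
X = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(add_complementary_inputs(X))
# [[0.9 0.1 0.1 0.9]
#  [0.2 0.8 0.8 0.2]]
```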
Keywords :
backpropagation; multilayer perceptrons; backpropagation-based training algorithms; complementary inputs; decision surface skew; network weights; sub-optimal decision surfaces; Backpropagation algorithms; Computer science; Encoding; Equations; Guidelines; Multilayer perceptrons; Neural networks; Neurons; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
ISSN :
1098-7576
Print_ISBN :
0-7803-7278-6
Type :
conf
DOI :
10.1109/IJCNN.2002.1005480
Filename :
1005480