Title :
Removing decision surface skew using complementary inputs
Author :
Andersen, Timothy L.
Author_Institution :
Comput. Sci. Dept., Boise State Univ., ID, USA
Date :
2002
Abstract :
This paper examines the tendency of backpropagation-based training algorithms to favor examples with large input feature values, in terms of the ability of such examples to influence the network weights, and shows that this tendency can lead to sub-optimal decision surfaces. A method for counteracting this tendency is proposed that augments the original input feature vector with complementary inputs.
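The abstract's exact definition of a complementary input is not given here, so the following is only a minimal illustrative sketch: assuming each feature is scaled to [0, 1], one common way to "complement" an input is to append (1 - x) for every original feature x, so that a small original value is accompanied by a correspondingly large complementary value and no example is systematically under-weighted.

```python
import numpy as np

def add_complementary_inputs(X):
    """Append a complementary input (1 - x) for each original feature.

    Assumes features are scaled to [0, 1]. The complement 1 - x is an
    illustrative choice, not necessarily the paper's exact formulation.
    """
    X = np.asarray(X, dtype=float)
    # Stack the complements alongside the originals, doubling the width.
    return np.hstack([X, 1.0 - X])

# Example: two samples with two features each become two samples with four.
X = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(add_complementary_inputs(X))
```

The augmented matrix is then fed to the network in place of the original inputs; every row now has the same total feature mass, which is the intuition behind using complements to reduce skew.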
Keywords :
backpropagation; multilayer perceptrons; backpropagation-based training algorithms; complementary inputs; decision surface skew; network weights; sub-optimal decision surfaces; Backpropagation algorithms; Computer science; Encoding; Equations; Guidelines; Multilayer perceptrons; Neural networks; Neurons; Training data
Conference_Title :
Neural Networks, 2002. IJCNN '02. Proceedings of the 2002 International Joint Conference on
Conference_Location :
Honolulu, HI
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1005480