DocumentCode :
329061
Title :
Asymptotic insensitivity to weights perturbations in back-propagation classifiers
Author :
Alippi, Cesare
Author_Institution :
Dipartimento di Elettronica, Politecnico di Milano, Italy
Volume :
2
fYear :
1993
fDate :
25-29 Oct. 1993
Firstpage :
1713
Abstract :
The author deals with the asymptotic behaviour of the weights in a back-propagation classifier. The goal is to identify an insensitivity level associated with the network, generated during training and capable of masking the loss in performance induced by perturbations of the weights. The insensitivity level increases as the classifier is overtrained. By adapting Tesauro's (1990) asymptotic results, the leading-order behaviour of the weights is compared with the insensitivity degree previously derived by the author (1992). In particular, it is shown that there exists a training epoch at which the classifier is insensitive to a given set of bounded perturbations of the weights. Removal of a single connection is not a bounded perturbation and must be analysed differently. In particular, when a connection is removed from each output neuron: 1) networks without hidden units may not assure correct classification; and 2) networks with hidden layers assure proper classification whenever the hidden units do not saturate late in learning.
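The abstract's notion of insensitivity can be probed empirically: perturb the weights of a trained feedforward classifier by bounded amounts and check whether classification decisions change. The following minimal Python sketch illustrates that idea only; the one-hidden-layer sigmoid network, the uniform perturbation model, and all function names are illustrative assumptions, not the author's procedure.

import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, W2, b2):
    # One-hidden-layer network with sigmoid hidden units, standing in for
    # a basic back-propagation classifier (illustrative assumption).
    h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))
    return h @ W2 + b2

def decision_flip_rate(X, W1, b1, W2, b2, eps, trials=100):
    """Fraction of inputs whose predicted class changes under weight
    perturbations bounded by eps (drawn uniformly from [-eps, eps])."""
    base = np.argmax(forward(X, W1, b1, W2, b2), axis=1)
    flips = 0.0
    for _ in range(trials):
        dW1 = rng.uniform(-eps, eps, W1.shape)
        dW2 = rng.uniform(-eps, eps, W2.shape)
        pert = np.argmax(forward(X, W1 + dW1, b1, W2 + dW2, b2), axis=1)
        flips += np.mean(pert != base)
    return flips / trials

# Toy example: random weights stand in for a trained classifier.
X = rng.normal(size=(200, 4))
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)
for eps in (0.01, 0.1, 0.5):
    print(f"eps={eps}: flip rate {decision_flip_rate(X, W1, b1, W2, b2, eps):.3f}")

A perturbation bound eps whose flip rate stays at zero would correspond, in the abstract's terms, to a set of bounded weight perturbations to which the classifier is insensitive.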
Keywords :
backpropagation; neural nets; pattern classification; sensitivity analysis; asymptotic insensitivity; backpropagation classifier; bounded perturbations; hidden units; learning; weights; weights perturbations; Computer science; Convergence; Degradation; Differential equations; Fault tolerance; Feedforward systems; Neurons; Performance loss; Robustness; Sensitivity analysis;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
Type :
conf
DOI :
10.1109/IJCNN.1993.716984
Filename :
716984