DocumentCode :
1264378
Title :
A simple procedure for pruning back-propagation trained neural networks
Author :
Karnin, Ehud D.
Author_Institution :
IBM Sci. & Technol., Technion City, Haifa, Israel
Volume :
1
Issue :
2
fYear :
1990
fDate :
1 June 1990
Firstpage :
239
Lastpage :
242
Abstract :
The sensitivity of the global error (cost) function to the inclusion or exclusion of each synapse in the artificial neural network is estimated. Shadow arrays are introduced that keep track of the incremental changes to the synaptic weights during a single pass of back-propagation learning. The synapses are then ordered by decreasing sensitivity, so that the network can be efficiently pruned by discarding the last items of the sorted list. Unlike previous approaches, this simple procedure does not require a modification of the cost function, does not interfere with the learning process, and demands negligible computational overhead.
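Code_Sketch :
The abstract describes the procedure concretely enough to illustrate in code: accumulate squared weight updates in a shadow array during ordinary training, convert them to sensitivity estimates, and prune the least sensitive synapses. Below is a minimal Python/NumPy sketch, assuming a toy linear model trained by plain gradient descent; the problem setup, hyper-parameters, and names (shadow, sensitivity, k) are illustrative assumptions, not the paper's experiments.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: y = X @ w_true, with several irrelevant inputs
# that a good pruning rule should discard (illustrative data, not from the paper).
X = rng.normal(size=(200, 8))
w_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.5])
y = X @ w_true

eta = 0.01                      # learning rate
w = rng.normal(scale=0.1, size=8)
w_init = w.copy()
shadow = np.zeros_like(w)       # shadow array: accumulates squared weight increments

for epoch in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(X)   # gradient of mean squared error
    dw = -eta * grad                         # incremental weight change
    shadow += dw ** 2                        # tracked during the normal learning pass
    w += dw

# Sensitivity estimate in the form reported for this method:
# S = sum(dw^2) / eta * w_final / (w_final - w_init); the small epsilon
# guards against weights that barely moved.
sensitivity = shadow / eta * w / (w - w_init + 1e-12)

# Prune: sort by sensitivity and discard the least sensitive synapses.
k = 4                            # number of weights to keep (assumed)
keep = np.argsort(sensitivity)[-k:]
pruned = np.zeros_like(w)
pruned[keep] = w[keep]
print("kept indices:", sorted(keep))
print("pruned weights:", np.round(pruned, 2))

Because the shadow array is filled with quantities the backward pass already computes, the extra cost per update is one multiply-add per weight, which is the negligible overhead the abstract claims.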
Keywords :
learning systems; neural nets; back-propagation; cost function; global error; learning process; neural networks; sensitivity; shadow arrays; synaptic weights; Artificial neural networks; Computational efficiency; Cost function; Learning systems; Neural networks; Neurons; Training data
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
Journal
DOI :
10.1109/72.80236
Filename :
80236