DocumentCode :
3592655
Title :
Applied self-recovery technique to link and neuron prunings
Author :
Lursinsap, C.
Author_Institution :
Center for Adv. Comput. Studies, Southwestern Louisiana Univ., Lafayette, LA, USA
Volume :
1
fYear :
1994
Firstpage :
545
Abstract :
Pruning algorithms based on shifting the weights of pruned links and/or neurons to the surviving parts of the network are described. After network training has been completed, each hidden neuron that has an insignificant effect on performance is removed and its connection weights are shifted to other links. Experimental results show that in a classification problem, 5% to 45% of the links can be removed while the pruned network retains essentially the same performance as the unpruned network. This technique requires neither a retraining process nor modification of the error cost function. The time complexities of the link and neuron pruning algorithms are O(n²) and O(m), respectively, where n is the number of links of a neuron and m is the number of neurons in the given network.
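The abstract does not spell out how a neuron's significance is measured or exactly how its weights are shifted; the sketch below is only an illustrative interpretation, not the paper's algorithm. It assumes a one-hidden-layer network, scores each hidden neuron by the variance of its contribution to the output over a calibration set, prunes the least significant one, and folds that neuron's mean contribution into the output bias so no retraining is needed (the names `prune_hidden_neuron` and the bias-compensation rule are assumptions).

```python
import numpy as np

def prune_hidden_neuron(W1, b1, W2, b2, X):
    """Illustrative neuron pruning with weight shifting (a sketch,
    not the paper's published procedure).

    W1: (n_in, n_hidden)  input-to-hidden weights
    b1: (n_hidden,)       hidden biases
    W2: (n_hidden, n_out) hidden-to-output weights
    b2: (n_out,)          output biases
    X:  (n_samples, n_in) calibration inputs
    """
    H = np.tanh(X @ W1 + b1)                  # hidden activations
    # Significance heuristic (assumption): variance of each neuron's
    # contribution to the outputs over the calibration set.
    contrib = H[:, :, None] * W2[None, :, :]  # (samples, hidden, out)
    significance = contrib.var(axis=(0, 2))
    k = int(np.argmin(significance))          # least significant neuron
    # "Self-recovery" step (assumption): shift the pruned neuron's mean
    # contribution into the output bias so the output stays close.
    b2_new = b2 + H[:, k].mean() * W2[k]
    keep = [j for j in range(W1.shape[1]) if j != k]
    return W1[:, keep], b1[keep], W2[keep], b2_new, k
```

One such call removes a single hidden neuron and all of its links without retraining; applying it repeatedly, while monitoring classification accuracy, mirrors the abstract's observation that a sizeable fraction of links can be removed with essentially unchanged performance.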
Keywords :
learning (artificial intelligence); neural nets; pattern classification; redundancy; algorithm; classification; link pruning; network training; neuron pruning; self-recovery; time complexity; weight shifting; Computational efficiency; Computer networks; Cost function; Hardware; Ink; Neural networks; Neurons; Redundancy; Thumb;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 37th Midwest Symposium on Circuits and Systems, 1994
Print_ISBN :
0-7803-2428-5
Type :
conf
DOI :
10.1109/MWSCAS.1994.519297
Filename :
519297