Title :
Optimal Brain Surgeon and general network pruning
Author :
Hassibi, Babak ; Stork, David G. ; Wolff, Gregory J.
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., CA, USA
Abstract :
The use of information from all second-order derivatives of the error function to perform network pruning (i.e., removing unimportant weights from a trained network) is investigated, with the aims of improving generalization, simplifying networks, reducing hardware or storage requirements, increasing the speed of further training, and, in some cases, enabling rule extraction. The method, Optimal Brain Surgeon (OBS), is significantly better than magnitude-based methods and Optimal Brain Damage, which often remove the wrong weights. OBS permits pruning of more weights than other methods (for the same error on the training set), and thus yields better generalization on test data. Crucial to OBS is a recursion relation for calculating the inverse Hessian matrix H^-1 from training data and structural information of the net. OBS deletes the correct weights from a trained XOR network in every case.
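A minimal NumPy sketch of the two results the abstract refers to: the recursion for building H^-1 from per-pattern gradients (a Sherman-Morrison-style update under the outer-product approximation of the Hessian used in the paper), and the OBS saliency/update rule L_q = w_q^2 / (2 [H^-1]_qq), delta_w = -(w_q / [H^-1]_qq) H^-1 e_q. Function names and the regularizer alpha are illustrative choices, not from the paper.

```python
import numpy as np

def inverse_hessian(output_grads, alpha=1e-4):
    # Build H^-1 recursively from per-pattern output gradients X_k,
    # using the outer-product approximation H ~ (1/P) sum_k X_k X_k^T.
    # Initialization H_0 = alpha * I (small alpha) keeps the recursion
    # well-conditioned, so H_0^-1 = (1/alpha) * I.
    P = len(output_grads)
    n = output_grads[0].size
    H_inv = np.eye(n) / alpha
    for X in output_grads:
        X = X.reshape(-1, 1)                        # column vector
        denom = P + (X.T @ H_inv @ X).item()        # scalar denominator
        H_inv = H_inv - (H_inv @ X) @ (X.T @ H_inv) / denom
    return H_inv

def obs_prune_step(w, H_inv):
    # Saliency of each weight: L_q = w_q^2 / (2 [H^-1]_qq).
    saliencies = w ** 2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliencies))                  # least-salient weight
    # Second-order update of ALL remaining weights; unlike magnitude
    # pruning, this drives w[q] exactly to zero while compensating
    # the other weights for its removal.
    delta_w = -(w[q] / H_inv[q, q]) * H_inv[:, q]
    return w + delta_w, q, float(saliencies[q])
```

Repeating obs_prune_step until the smallest saliency exceeds an error budget reproduces the iterative prune-and-adjust loop the abstract describes; updating every weight at each deletion is what distinguishes OBS from Optimal Brain Damage, which assumes a diagonal Hessian.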
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); neural nets; Optimal Brain Surgeon; error function; general network pruning; generalization; inverse Hessian matrix; recursion relation; rule extraction; second-order derivatives; storage requirements; structural information; trained XOR network; Backpropagation; Benchmark testing; Biological neural networks; Data mining; Hardware; Machine learning; Pattern recognition; Statistics; Surges; Training data
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298572