DocumentCode
352495
Title
A new method to prune the neural network
Author
Wan, Weishui ; Hirasawa, Kotaro ; Hu, Jinglu ; Jin, ChunZhi
Author_Institution
Graduate Sch. of Inf. Sci. & Electr. Eng., Kyushu Univ., Fukuoka, Japan
Volume
6
fYear
2000
fDate
2000
Firstpage
449
Abstract
Training neural networks with the backpropagation (BP) algorithm is a widely adopted practice in both theory and applications. However, BP yields a distributed weight representation: the weight matrix of the final trained network is usually not sparse, which prevents its use for discovering the inherent functional relations between the input and output data. Some form of structure optimization is therefore needed to overcome this shortcoming. With this in mind, this paper proposes a new method for pruning neural networks based on statistical quantities of the network. Compared with known pruning methods such as structural learning with forgetting and the RPROP algorithm, the proposed method attains comparable or even better results without an evident increase in computational load. Detailed simulations on the Iris data set support this assertion.
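The abstract does not specify which statistical quantities drive the pruning, so the following is only a minimal sketch of statistics-based weight pruning, assuming a layer-wise criterion that zeroes weights whose magnitude falls below a fraction of that layer's weight standard deviation; the function name prune_layer and the factor k are hypothetical and not taken from the paper.

import numpy as np

def prune_layer(weights, k=0.5):
    """Zero out weights whose magnitude is below k * std of the layer.

    weights : 2-D weight matrix of one layer
    k       : hypothetical sensitivity factor (an assumption, not from the paper)
    """
    threshold = k * np.std(weights)          # layer-wise statistical threshold
    mask = np.abs(weights) >= threshold      # True for weights that survive
    return weights * mask, mask

# Usage on a random weight matrix
rng = np.random.default_rng(0)
W = rng.normal(scale=0.3, size=(4, 3))
W_pruned, kept = prune_layer(W, k=0.5)
print(f"kept {kept.sum()} of {kept.size} weights")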
Keywords
backpropagation; data mining; neural nets; optimisation; statistical analysis; learning algorithm; neural network; pruning; rule discovery; structure optimization; weight matrix; Backpropagation algorithms; Computational modeling; Convergence; Data mining; Information science; Intelligent control; Iris; Laboratories; Neural networks; Stochastic processes
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, IJCNN 2000
Conference_Location
Como, Italy
ISSN
1098-7576
Print_ISBN
0-7695-0619-4
Type
conf
DOI
10.1109/IJCNN.2000.859436
Filename
859436
Link To Document