DocumentCode :
1804202
Title :
Generating rules from trained network using fast pruning
Author :
Setiono, Rudy ; Leow, Wee Kheng
Author_Institution :
Sch. of Comput., Nat. Univ. of Singapore, Singapore
Volume :
6
fYear :
1999
fDate :
36342
Firstpage :
4095
Abstract :
Before symbolic rules are extracted from a trained neural network, the network is usually pruned so as to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network, FERNN first identifies the relevant hidden units by computing their information gains. Next, it identifies the relevant connections from the input units to the relevant hidden units by checking the magnitudes of their weights. Finally, FERNN generates rules based on the relevant hidden units and weights. Our experimental results show that the size and accuracy of the trees generated are comparable to those of trees extracted by another method that prunes and retrains the network.
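The abstract describes two relevance tests: ranking hidden units by the information gain of their activations with respect to the class labels, and keeping only input-to-hidden connections whose weight magnitudes are large. The following is a minimal Python sketch of those two tests, not the authors' implementation; the binary activation split, the thresholds (min_gain, min_weight), and the toy network used in the usage example are illustrative assumptions.

```python
# Minimal sketch of FERNN-style relevance tests (illustrative assumptions, not the paper's code).
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(activations, labels, split=0.0):
    """Gain from splitting samples on whether a hidden unit's activation exceeds `split`."""
    left, right = labels[activations <= split], labels[activations > split]
    if len(left) == 0 or len(right) == 0:
        return 0.0
    weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - weighted

def relevant_hidden_units(H, y, min_gain=0.05):
    """Indices of hidden units whose activations are informative about the class labels."""
    return [j for j in range(H.shape[1]) if information_gain(H[:, j], y) >= min_gain]

def relevant_connections(W_in, hidden_idx, min_weight=0.1):
    """For each kept hidden unit, the input indices whose |weight| is non-negligible."""
    return {j: np.flatnonzero(np.abs(W_in[:, j]) >= min_weight) for j in hidden_idx}

# Toy usage with a random "trained" network: 8 inputs, 5 hidden units, 200 samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
W_in = rng.normal(size=(8, 5))
H = np.tanh(X @ W_in)                 # hidden-layer activations
y = (H[:, 0] > 0).astype(int)         # labels correlated with hidden unit 0 only
keep = relevant_hidden_units(H, y)
print("relevant hidden units:", keep)
print("relevant inputs per unit:", relevant_connections(W_in, keep))
```

In this toy run only hidden unit 0 carries class information, so the gain test keeps it and discards the rest; the weight-magnitude test then lists, per kept unit, which inputs would appear in the extracted rules.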
Keywords :
decision trees; feedforward neural nets; learning (artificial intelligence); pattern classification; symbol manipulation; decision trees; feedforward neural network; pruning; rule extraction; symbolic classification; Artificial neural networks; Biological neural networks; Classification tree analysis; Computer networks; Costs; Decision trees; Entropy; Feedforward neural networks; Message-oriented middleware; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.830817
Filename :
830817