DocumentCode :
295812
Title :
Inserting background knowledge in perceptrons through modification of the learning algorithm
Author :
Bode, Jürgen ; Liang, Xun ; Zhang, Xiping ; Ren, Shouju
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Volume :
2
fYear :
1995
fDate :
Nov/Dec 1995
Firstpage :
807
Abstract :
Usually, the knowledge to be learned by neural networks is represented implicitly in the training samples. The ability to insert knowledge apart from the implicit representations in the training samples ("background knowledge") gives rise to the hope that the learning and operating behavior of neural networks can be improved. In this paper, we develop a method to insert expert knowledge into the error function during training. We modify the backpropagation learning algorithm such that the network is trained not only to minimize output error but also to respect further information provided by the expert users who train a multilayer perceptron with one hidden layer. The results are tested on an artificial example from design cost estimation using very small training sets of 10 samples. They show significant improvement compared with approaches that do not consider background knowledge.
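The idea of augmenting the training error with a background-knowledge term can be sketched as below. This is a minimal illustration only, assuming a monotonicity hint ("the output should not decrease as the input grows") as the expert knowledge and a hypothetical penalty weight `lam`; it is not the paper's exact formulation.

```python
import numpy as np

# Sketch: a one-hidden-layer perceptron trained with gradient descent on
# E = MSE + lam * penalty, where the penalty encodes background knowledge
# (here: outputs should be non-decreasing along sorted inputs).
# The penalty form and lam are illustrative assumptions.

rng = np.random.default_rng(0)

# Tiny training set (10 samples), mirroring the very small set sizes
# mentioned in the abstract.
X = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
y = (0.5 * X + 0.2).ravel()            # underlying monotone target

n_hidden = 5
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)           # hidden layer
    return h, (h @ W2 + b2).ravel()    # linear output

lam = 0.1   # weight of the background-knowledge penalty (assumed)
lr = 0.5

for epoch in range(2000):
    h, out = forward(X)
    err = out - y
    # Background-knowledge penalty: squared size of any decrease
    # between consecutive outputs along the sorted inputs.
    diffs = out[1:] - out[:-1]
    viol = np.minimum(diffs, 0.0)      # negative where monotonicity broken

    # Gradient of the squared-error part (standard backpropagation) ...
    g_out = 2 * err / len(y)
    # ... plus the gradient of lam * sum(viol**2) w.r.t. the outputs.
    g_pen = np.zeros_like(out)
    g_pen[1:] += 2 * viol
    g_pen[:-1] -= 2 * viol
    g_out = g_out + lam * g_pen / len(y)

    gW2 = h.T @ g_out[:, None]
    gb2 = g_out.sum(keepdims=True)
    g_h = g_out[:, None] @ W2.T * (1 - h**2)
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, final = forward(X)
mse = float(np.mean((final - y) ** 2))
print(round(mse, 4))
```

Because the penalty differentiates through the outputs only, the extra term folds into the ordinary backpropagation pass as a modified output-layer error signal, which is the spirit of modifying the learning algorithm rather than the data.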
Keywords :
backpropagation; multilayer perceptrons; background knowledge insertion; implicit representations; learning algorithm modification; output error minimization; training samples; Automation; Backpropagation algorithms; Computer science; Knowledge based systems; Management training; Multilayer perceptrons; Network topology; Neural networks; Product design; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1995 IEEE International Conference on Neural Networks
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.487521
Filename :
487521