DocumentCode
353274
Title
What inductive bias gives good neural network training performance?
Author
Snyders, S.; Omlin, C.W.
Author_Institution
Dept. of Comput. Sci., Stellenbosch Univ., South Africa
Volume
3
fYear
2000
fDate
2000
Firstpage
445
Abstract
There has been increased interest in the use of prior knowledge for training neural networks. Given a set of training examples and an initial domain theory, a neural network is constructed that fits the training examples by preprogramming some of its weights. The initialized network is then trained using backpropagation to refine the knowledge. This paper proposes a heuristic for determining the strength of the inductive bias that uses gradient information in weight space in the direction of the programmed weights. The network starts its search in the region of weight space where the gradient is maximal, thus speeding up convergence. Tests on a benchmark problem from molecular biology demonstrate that our heuristic reduces training time by 60% on average compared to a random choice of the strength of the inductive bias; this performance is within 20% of the training time achievable with the optimal inductive bias. The difference in generalization performance is not statistically significant.
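The abstract's heuristic can be illustrated with a minimal sketch. The paper's actual procedure is not given here; the code below assumes a simple logistic model, a hypothetical "domain theory" direction vector, and a scalar bias strength, and simply picks the strength at which the loss gradient is largest before training begins:

```python
import numpy as np

# Illustrative sketch only (not the paper's implementation): scale the
# programmed weights so that training starts where the loss gradient
# magnitude is maximal, per the heuristic described in the abstract.

rng = np.random.default_rng(0)

# Toy data: 20 binary examples with 5 features (hypothetical).
X = rng.integers(0, 2, size=(20, 5)).astype(float)
y = rng.integers(0, 2, size=20).astype(float)

# Hypothetical "domain theory": a unit direction of programmed weights.
w_dir = np.array([1.0, -1.0, 1.0, 0.0, 0.0])
w_dir /= np.linalg.norm(w_dir)

def grad_norm(scale):
    """Norm of the logistic-loss gradient at w = scale * w_dir."""
    w = scale * w_dir
    p = 1.0 / (1.0 + np.exp(-X @ w))        # sigmoid outputs
    g = X.T @ (p - y) / len(y)              # mean cross-entropy gradient
    return np.linalg.norm(g)

# Choose the inductive-bias strength with the largest gradient norm,
# so backpropagation begins its search where the gradient is maximal.
candidates = np.linspace(0.1, 10.0, 50)
best_strength = max(candidates, key=grad_norm)
w_init = best_strength * w_dir             # initial weights for training
```

After this selection, ordinary backpropagation would proceed from `w_init`; the candidate grid and the logistic model are stand-ins for whatever network and bias parameterization the paper actually uses.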
Keywords
backpropagation; convergence; feedforward neural nets; gradient methods; knowledge based systems; optimisation; heuristic; inductive bias; initial domain theory; knowledge based neural network; learning performance; preprogramming; weight space; artificial neural networks; computer networks; computer science; feedforward neural networks; neural networks; testing
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location
Como
ISSN
1098-7576
Print_ISBN
0-7695-0619-4
Type
conf
DOI
10.1109/IJCNN.2000.861348
Filename
861348
Link To Document