DocumentCode
274175
Title
Weight limiting, weight quantisation and generalisation in multi-layer perceptrons
Author
Woodland, P.C.
Author_Institution
British Telecom Res. Labs., Ipswich, UK
fYear
1989
fDate
16-18 Oct 1989
Firstpage
297
Lastpage
300
Abstract
If a multilayer perceptron (MLP) is to be implemented on fixed-point hardware, the robustness of the structure to weight quantisation is important. Most work on MLP performance neglects this issue, and it is typically addressed only after a network has been trained. It is shown that both generalisation performance and robustness to weight quantisation can be improved by including explicit weight-range limiting in the MLP training procedure. This is illustrated by results of simulations on a speech recognition problem.
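A minimal sketch of the idea described in the abstract: clip every weight into a fixed range after each gradient update, then check how much accuracy survives when the trained weights are rounded to a fixed-point grid. The network size, weight limit, bit width and toy data below are illustrative assumptions, not values from the paper.

# Hedged sketch: weight-range limiting during MLP training, plus a simple
# fixed-point quantisation check. All hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (the paper uses a speech recognition task instead).
X = rng.normal(size=(200, 8))
y = (X[:, :2].sum(axis=1) > 0).astype(float).reshape(-1, 1)

# One-hidden-layer MLP with sigmoid units.
W1 = rng.normal(scale=0.5, size=(8, 12)); b1 = np.zeros(12)
W2 = rng.normal(scale=0.5, size=(12, 1)); b2 = np.zeros(1)

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
W_MAX = 2.0   # assumed weight-range limit
LR = 0.1

for epoch in range(200):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (squared-error gradients for sigmoid units).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update.
    W2 -= LR * h.T @ d_out / len(X)
    b2 -= LR * d_out.mean(axis=0)
    W1 -= LR * X.T @ d_h / len(X)
    b1 -= LR * d_h.mean(axis=0)

    # Explicit weight-range limiting: clip every weight into [-W_MAX, W_MAX]
    # after each update, so training never relies on weights larger than the
    # fixed-point representation can hold.
    for w in (W1, b1, W2, b2):
        np.clip(w, -W_MAX, W_MAX, out=w)

def quantise(w, bits=8, w_max=W_MAX):
    # Round weights onto a signed fixed-point grid spanning [-w_max, w_max].
    step = w_max / (2 ** (bits - 1))
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def accuracy(W1_, b1_, W2_, b2_):
    p = sigmoid(sigmoid(X @ W1_ + b1_) @ W2_ + b2_)
    return ((p > 0.5) == y).mean()

# Robustness check: full-precision vs quantised weights.
print("full precision :", accuracy(W1, b1, W2, b2))
print("8-bit quantised:", accuracy(*(quantise(w) for w in (W1, b1, W2, b2))))

Because the clipping bound matches the range of the fixed-point grid, the quantised network is a small perturbation of the trained one rather than a truncation of large weights, which is the intuition behind the reported robustness gain.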
Keywords
learning systems; neural nets; speech recognition; fixed point hardware; generalisation performance; multilayer perceptron; robustness; training procedure; weight quantisation; weight-range limiting
fLanguage
English
Publisher
IET
Conference_Title
First IEE International Conference on Artificial Neural Networks, 1989 (Conf. Publ. No. 313)
Conference_Location
London
Type
conf
Filename
51979