DocumentCode :
2647779
Title :
On the extent of simplifications in backpropagation training equations
Author :
Bahrami, Mohammad
Author_Institution :
Sch. of Electr. Eng., Univ. of New South Wales, Sydney, NSW, Australia
fYear :
1994
fDate :
29 Nov-2 Dec 1994
Firstpage :
18
Lastpage :
21
Abstract :
Usually, a simplified form of the backpropagation training equations is used instead of the original equations when training neural networks. The validity of these simplifications and the extent to which the simplified methods remain effective are explored. It is shown that, although these methods can reduce the time and complexity of the computations carried out during training, their effectiveness is problem dependent.
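Note (illustrative only; the record does not state which simplifications the paper examines): as a sketch of the kind of simplification commonly applied to the backpropagation training equations for a sigmoid unit $f(x) = 1/(1 + e^{-x})$, the original output-layer and hidden-layer updates are
$$\delta_k = (t_k - o_k)\, o_k (1 - o_k), \qquad \delta_j = o_j (1 - o_j) \sum_k w_{kj}\, \delta_k, \qquad \Delta w_{ji} = \eta\, \delta_j\, o_i,$$
and a frequently used simplification replaces the output-layer term by $\delta_k \approx t_k - o_k$, omitting the derivative factor $o_k(1 - o_k)$. This saves per-pattern computation but changes the effective error surface, which is consistent with the abstract's observation that the benefit of such shortcuts is problem dependent.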
Keywords :
backpropagation; computational complexity; neural nets; backpropagation training equations; neural network training; problem dependent effectiveness; simplified form; Backpropagation; Computational complexity; Differential equations; Multilayer perceptrons; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1994 Second Australian and New Zealand Conference on Intelligent Information Systems
Conference_Location :
Brisbane, Qld.
Print_ISBN :
0-7803-2404-8
Type :
conf
DOI :
10.1109/ANZIIS.1994.396958
Filename :
396958