DocumentCode
2971014
Title
Fast backpropagation for supervised learning
Author
Ngolediage, J.E.; Naguib, R.N.G.; Dlay, S.S.
Author_Institution
Dept. of Electr. & Electron. Eng., Newcastle upon Tyne Univ., UK
Volume
3
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
2591
Abstract
In this paper, fast backpropagation (Fbp), a new, simple and computationally efficient variant of standard backpropagation, is proposed. It continuously adapts the learning rate parameter ε for individual synapses, using only network variables and without any significant increase in circuit complexity. The method is related to the Fermi-Dirac distribution, which is based upon quantum principles. The 'mean' update procedure employed offers a remarkable degree of stability and robustness. Even on individual runs, Fbp on average converges more quickly, particularly for non-Boolean inputs, and generalizes better than Quickprop when started from an identical set of initial random weights.
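The abstract describes Fbp only at a high level: a per-synapse learning rate ε adapted from local network variables via a Fermi-Dirac-type function. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual update rule; the names fermi_dirac and fbp_update, the choice of gating on gradient magnitude, and all parameter values are assumptions introduced here for illustration.

```python
import numpy as np

def fermi_dirac(x, mu=0.0, temperature=1.0):
    """Fermi-Dirac occupation function 1 / (1 + exp((x - mu) / T))."""
    return 1.0 / (1.0 + np.exp((x - mu) / temperature))

def fbp_update(weights, grads, eps_base=0.5, mu=0.1, temperature=0.05):
    """One hypothetical Fbp-style update (illustrative assumption only).

    Each synapse gets its own effective learning rate: a base rate scaled by
    a Fermi-Dirac gate evaluated on the local gradient magnitude, so small
    gradients pass nearly unscaled while large ones are damped.
    """
    eps = eps_base * fermi_dirac(np.abs(grads), mu=mu, temperature=temperature)
    return weights - eps * grads

# Toy usage: a 3-input, 2-output layer with stand-in gradients.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(3, 2))
g = rng.normal(scale=0.5, size=(3, 2))  # stand-in for backpropagated gradients
w_new = fbp_update(w, g)
print(w_new.shape)  # (3, 2)
```

Because the Fermi-Dirac function has the same functional form as the logistic sigmoid, such a gate can be computed from quantities already present in a backpropagation circuit, which is consistent with the abstract's claim of negligible added circuit complexity.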
Keywords
convergence; learning (artificial intelligence); neural nets; Fermi-Dirac distribution; fast backpropagation; learning rate parameter; mean update procedure; non-Boolean inputs; robustness; stability; supervised learning; Circuit stability; Complexity theory; Difference equations; Error correction; Robust stability; Temperature distribution
fLanguage
English
Publisher
IEEE
Conference_Title
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.714254
Filename
714254
Link To Document