DocumentCode
423658
Title
Deterministic weight modification algorithm for efficient learning
Author
Ng, S.C.; Cheung, C.C.; Leung, S.H.
Author_Institution
Sch. of Sci. & Tech., Open Univ. of HK, Hong Kong, China
Volume
2
fYear
2004
fDate
25-29 July 2004
Firstpage
1033
Abstract
This paper presents a new approach, deterministic weight modification (DWM), to effectively speed up convergence and improve the global convergence capability of the standard and modified back-propagation (BP) algorithms. The main idea of DWM is to reduce the system error by changing the weights of a multi-layered feed-forward neural network in a deterministic way. Simulation results show that DWM outperforms standard BP and other modified BP algorithms on a number of learning problems.
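The abstract does not give the exact update rule, so the following Python sketch is only one plausible reading of a DWM-style step: deterministically probe small single-weight changes on a one-hidden-layer feed-forward network and keep the change that most reduces the sum-of-squares system error. All names (forward, sse, dwm_step) and the probe step sizes are illustrative assumptions, not the authors' published method.

# Hypothetical DWM-style step: probe deterministic single-weight changes
# and commit the one that most reduces the system error. This is an assumed
# interpretation of the abstract, not the paper's actual update rule.
import numpy as np

def forward(W1, W2, X):
    # One-hidden-layer feed-forward pass with tanh hidden units.
    H = np.tanh(X @ W1)
    return H @ W2

def sse(W1, W2, X, T):
    # Sum-of-squares system error over the whole training set.
    Y = forward(W1, W2, X)
    return 0.5 * np.sum((Y - T) ** 2)

def dwm_step(W1, W2, X, T, deltas=(0.1, 0.01)):
    # Try +/- delta on every weight in turn (a deterministic schedule)
    # and keep only the single change giving the lowest error.
    best_err, best_change = sse(W1, W2, X, T), None
    for W in (W1, W2):
        for idx in np.ndindex(W.shape):
            for d in deltas:
                for step in (d, -d):
                    W[idx] += step                # trial modification
                    err = sse(W1, W2, X, T)
                    W[idx] -= step                # undo the trial
                    if err < best_err:
                        best_err, best_change = err, (W, idx, step)
    if best_change is not None:                   # apply the winning change
        W, idx, step = best_change
        W[idx] += step
    return best_err

# Usage on the XOR problem (illustrative):
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
for _ in range(300):
    err = dwm_step(W1, W2, X, T)
print(f"system error after DWM-only training: {err:.4f}")

In the paper, DWM is used alongside standard or modified BP; the loop above omits the gradient step purely to keep the sketch short.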
Keywords
backpropagation; convergence; feedforward neural nets; gradient methods; multilayer perceptrons; convergence rate; deterministic weight modification algorithm; global convergence capability; modified backpropagation algorithms; multilayered feedforward neural network; system error reduction; Computational modeling; Convergence; Feedforward neural networks; Feedforward systems; Genetic algorithms; Multi-layer neural network; Neural networks; Neurons; Optimization methods; Simulated annealing
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
ISSN
1098-7576
Print_ISBN
0-7803-8359-1
Type
conf
DOI
10.1109/IJCNN.2004.1380076
Filename
1380076