DocumentCode
356750
Title
A weight evolution algorithm for finding the global minimum of error function in neural networks
Author
Ng, S.C.; Leung, S.H.
Author_Institution
Dept. of Comput. & Math., Hong Kong Inst. of Vocational Educ., China
Volume
1
fYear
2000
fDate
2000
Firstpage
153
Abstract
This paper introduces a new weight evolution algorithm for finding the global minimum of the error function in a multi-layered neural network. During the learning phase of backpropagation, the network weights are deliberately adjusted to improve system performance. By examining the outputs of the nodes, some of the network weights can be adjusted deterministically so as to achieve an overall reduction in system error. The idea is to work backward from the error components and the system outputs to deduce a deterministic perturbation of particular network weights for optimization purposes. Using the new algorithm, it is found that weight evolution between the hidden and output layers can accelerate the convergence speed, whereas weight evolution between the input and hidden layers can help to overcome the local minima problem.
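The abstract describes the algorithm only at a high level, so the following is a minimal sketch rather than the authors' method: a two-layer network trained with standard backpropagation on the XOR task, where an illustrative evolve_weights step applies a small deterministic perturbation to the hidden-to-output weights, derived from the per-node error components, whenever the error stagnates. The XOR data, the stagnation test, the step size, and the evolve_weights rule are all assumptions for illustration; the paper's exact perturbation rule is not given in the abstract.

```python
# Illustrative sketch only: evolve_weights is a hypothetical stand-in for the
# paper's deterministic weight-evolution rule, which the abstract does not specify.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR problem: a standard test case where plain backpropagation can stall.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 3))   # input  -> hidden weights
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights
lr = 0.5

def forward(X):
    H = sigmoid(X @ W1)      # hidden-layer activations
    Y = sigmoid(H @ W2)      # network outputs
    return H, Y

def evolve_weights(H, Y, T, W2, step=0.1):
    """Hypothetical weight-evolution step: work backward from the output
    error components to a small deterministic perturbation of the
    hidden-to-output weights that reduces the total squared error."""
    E = T - Y                              # error components at the output nodes
    delta = E * Y * (1.0 - Y)              # error signal through the sigmoid
    return W2 + step * (H.T @ delta)       # deterministic nudge on W2

prev_err = np.inf
for epoch in range(5000):
    H, Y = forward(X)
    err = float(np.mean((T - Y) ** 2))

    # Standard backpropagation updates.
    dY = (Y - T) * Y * (1.0 - Y)
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * (H.T @ dY)
    W1 -= lr * (X.T @ dH)

    # If the error has stagnated (slow convergence or a suspected local
    # minimum), apply the deterministic weight-evolution perturbation.
    if abs(prev_err - err) < 1e-6:
        W2 = evolve_weights(H, Y, T, W2)
    prev_err = err

print("final mean squared error:", prev_err)
```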
Keywords
backpropagation; convergence; multilayer perceptrons; convergence speed; error function global minimum; learning; multilayered neural network; optimization; system error; system performance; weight evolution algorithm; acceleration; computer networks; intelligent networks; mathematics; multi-layer neural network; neural networks; neurons; sufficient conditions
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 2000 Congress on Evolutionary Computation
Conference_Location
La Jolla, CA
Print_ISBN
0-7803-6375-2
Type
conf
DOI
10.1109/CEC.2000.870289
Filename
870289
Link To Document