DocumentCode :
2259201
Title :
A weight evolution algorithm with deterministic perturbation
Author :
Ng, Sin-Chun ; Leung, Shu-hung ; Luk, Andrew
Author_Institution :
Dept. of Comput. & Math., Hong Kong Inst. of Vocational Educ., China
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
185
Abstract :
Introduces a learning algorithm for multi-layered feedforward networks: a weight evolution algorithm with deterministic perturbation. During the learning phase of a gradient algorithm (such as backpropagation), the network weights are intentionally adjusted in order to improve system performance. The aim is to reduce the overall system error after every weight update. By examining the error components, some of the network weights can be adjusted substantially so as to achieve an overall reduction in system error. Using deterministic perturbation, it is found that weight evolution between the hidden and output layers accelerates convergence, whereas weight evolution between the input and hidden layers helps to solve the local minima problem.
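A minimal sketch of the perturb-and-accept idea described in the abstract, not the authors' exact algorithm: after (or between) gradient steps, each weight is nudged by a fixed, deterministic step in each direction, and the change is kept only when the overall system error decreases. The network shape, step size `delta`, and function names here are illustrative assumptions.

```python
import numpy as np

def evolve_weights(w, loss_fn, delta=0.05):
    """Deterministic-perturbation sketch (illustrative, not the paper's
    exact procedure): try +/-delta on each weight and keep the change
    only when the overall error decreases."""
    best = loss_fn(w)
    for idx in np.ndindex(w.shape):
        for step in (delta, -delta):
            w[idx] += step
            err = loss_fn(w)
            if err < best:
                best = err        # keep the beneficial perturbation
                break
            w[idx] -= step        # revert: no improvement
    return w, best

# Toy single-hidden-layer network (assumed shapes, tanh hidden units)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.normal(size=(20, 1))
w_hid = rng.normal(scale=0.5, size=(3, 4))   # input  -> hidden
w_out = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output

def output_error(w):
    """Mean-squared error of the network with hidden-to-output weights w."""
    h = np.tanh(X @ w_hid)
    return float(np.mean((h @ w - y) ** 2))

before = output_error(w_out)
w_out, after = evolve_weights(w_out, output_error, delta=0.05)
```

Because every perturbation is accepted only if it lowers the error, the sweep can never increase the loss; applied to the input-to-hidden weights as well, such jumps can move the network off a flat or locally minimal region that a pure gradient step would not escape.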
Keywords :
convergence; feedforward neural nets; gradient methods; learning (artificial intelligence); multilayer perceptrons; convergence speed; deterministic perturbation; gradient algorithm; hidden layer; learning algorithm; local minima problem; multi-layered feedforward network; network weights; output layer; weight evolution algorithm; Acceleration; Australia; Convergence; Equations; Error correction; Investments; Mathematics; System performance;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.857834
Filename :
857834