DocumentCode :
2831211
Title :
A new dynamic optimal learning rate for a two-layer neural network
Author :
Zhang, Tong ; Chen, C. L. Philip ; Wang, Chi-Hsu ; Tam, Sik Chung
Author_Institution :
Dept. of Comput. & Inf. Sci., Univ. of Macau, Macau, China
fYear :
2012
fDate :
June 30 2012-July 2 2012
Firstpage :
55
Lastpage :
59
Abstract :
The learning rate is crucial to the training process of a two-layer neural network (NN). Much research has therefore been devoted to finding the optimal learning rate, so that maximum error reduction is achieved at every iteration. In this paper, however, we find that this best learning rate can be further improved. By revising the search direction, we obtain a new dynamic optimal learning rate that converges in fewer iterations than the previous approach. After the first iteration, there exists a ratio k between our new optimal learning rate and the previous one. In the same experiment, the new optimal learning rate for the two-layer NN outperforms earlier approaches. We therefore conclude that our new dynamic optimal learning rate can be very useful in neural network applications.
Keywords :
convergence; iterative methods; learning (artificial intelligence); neural nets; convergence; dynamic optimal learning rate; iteration count; maximum error reduction; training process; two-layer neural network; Artificial neural networks; Convergence; Equations; Heuristic algorithms; Training; Vectors; learning rate; neural network; new optimal learning rate; ratio k; two-layer NN;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
System Science and Engineering (ICSSE), 2012 International Conference on
Conference_Location :
Dalian, Liaoning
Print_ISBN :
978-1-4673-0944-8
Electronic_ISBN :
978-1-4673-0943-1
Type :
conf
DOI :
10.1109/ICSSE.2012.6257148
Filename :
6257148