Title :
Analysis of weight initialization techniques for Gradient Descent algorithm
Author :
Sarfaraz Masood;M. N. Doja;Pravin Chandra
Author_Institution :
Dept. of Computer Engineering, Jamia Millia Islamia, New Delhi, India
Abstract :
Gradient descent backpropagation is the most commonly used training algorithm for artificial neural networks. In this paper, we conduct experiments to perform a detailed comparison of several well-known weight initialization techniques, including those proposed by Nguyen and Widrow, Drago et al., Kim and Ra, Chen and Nutter, and Bottou, in order to identify the best weight initialization method for the gradient descent approach. Six function approximation problems were chosen for experimentation. The results of a one-tailed t-test, together with the mean and standard deviation of the test error, were used as decision criteria. The results strongly suggest that the Nguyen-Widrow method is the best-suited initialization for the gradient descent training algorithm.
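Note :
For reference, the following is a minimal sketch of the Nguyen-Widrow initialization named in the abstract, for a single hidden layer. It assumes the commonly cited formulation in which weights are drawn uniformly in [-0.5, 0.5] and each hidden unit's weight vector is rescaled to norm beta = 0.7 * n_hidden**(1/n_in); the function name and parameters (n_in, n_hidden, rng) are illustrative and not taken from the paper.

```python
import numpy as np

def nguyen_widrow_init(n_in, n_hidden, rng=None):
    """Sketch of Nguyen-Widrow initialization for one hidden layer.

    Assumption: weights are first drawn uniformly in [-0.5, 0.5], then each
    hidden unit's weight vector is rescaled so its Euclidean norm equals
    beta = 0.7 * n_hidden ** (1 / n_in); biases are drawn uniformly in
    [-beta, beta]. This follows the commonly cited form of the method,
    not necessarily the exact variant used in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    beta = 0.7 * n_hidden ** (1.0 / n_in)

    # Random weights in [-0.5, 0.5], one row per hidden unit.
    w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_in))
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    w = beta * w / norms  # rescale each row to norm beta

    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b

# Example: a network with 2 inputs and 10 hidden units (values illustrative only).
W, b = nguyen_widrow_init(n_in=2, n_hidden=10)
```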
Conference_Title :
2015 Annual IEEE India Conference (INDICON)
Electronic_ISSN :
2325-9418
DOI :
10.1109/INDICON.2015.7443734