DocumentCode
3082007
Title
Fast and efficient training of neural networks
Author
Yu, Hao; Wilamowski, B. M.
Author_Institution
Auburn Univ., Auburn, AL, USA
fYear
2010
fDate
13-15 May 2010
Firstpage
175
Lastpage
181
Abstract
In this paper, second-order algorithms, such as the Levenberg-Marquardt algorithm, are recommended for neural network training. Unlike the traditional computation used in second-order algorithms, the proposed method simplifies the Hessian matrix computation by removing the computation and storage of the Jacobian matrix; matrix multiplications are replaced by vector operations. The proposed computation not only speeds up the training process but also significantly reduces the memory cost. With this improvement, second-order algorithms can be applied to problems with an unlimited number of patterns.
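To make the pattern-by-pattern accumulation idea concrete, the following minimal Python/NumPy sketch (not taken from the paper) builds the quasi-Hessian Q and the gradient vector g directly from per-pattern vectors, so the full Jacobian is never formed or stored. The helper name error_and_jrow, the scalar-error simplification, and the update routine are illustrative assumptions, not the authors' implementation.

import numpy as np

def lm_accumulate_step(params, patterns, error_and_jrow, mu=0.01):
    # One Levenberg-Marquardt-style update that accumulates
    # Q = sum_p j_p j_p^T and g = sum_p e_p * j_p pattern by pattern,
    # so only n-by-n and n-length arrays are ever stored.
    n = params.size
    Q = np.zeros((n, n))          # quasi-Hessian accumulator
    g = np.zeros(n)               # gradient accumulator
    for x, target in patterns:
        # e_p: scalar error for this pattern (assumed single output);
        # j_p: derivatives of e_p with respect to each parameter
        # (user-supplied helper, hypothetical here).
        e_p, j_p = error_and_jrow(params, x, target)
        Q += np.outer(j_p, j_p)   # rank-one vector update, no Jacobian matrix
        g += e_p * j_p
    # Damped Gauss-Newton step: (Q + mu*I) delta = g
    delta = np.linalg.solve(Q + mu * np.eye(n), g)
    return params - delta

Because Q and g are built up one pattern at a time, memory grows with the number of weights rather than with the number of training patterns, which is the saving the abstract refers to.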
Keywords
Hessian matrices; Jacobian matrices; learning (artificial intelligence); matrix multiplication; neural nets; Hessian matrix computation; Jacobian matrix computation; Jacobian matrix storage; Levenberg Marquardt algorithm; neural network training; Computer networks; Costs; Error correction; Neural networks; Neurons; Pattern matching; Signal processing; Signal processing algorithms
fLanguage
English
Publisher
ieee
Conference_Title
2010 3rd Conference on Human System Interactions (HSI)
Conference_Location
Rzeszow
Print_ISBN
978-1-4244-7560-5
Type
conf
DOI
10.1109/HSI.2010.5514571
Filename
5514571