Title :
Powell's dogleg trust-region steps with the quasi-Newton augmented Hessian for neural nonlinear least-squares learning
Author_Institution :
Dept. of Ind. Eng. & Oper. Res., California Univ., Berkeley, CA, USA
Abstract :
This paper highlights Powell's dogleg trust-region algorithms with self-scaling quasi-Newton Hessian augmentation for neural-network (NN) nonlinear least-squares problems. The dogleg algorithms approximate a restricted Levenberg-Marquardt step within the trust region of the local quadratic model in a piecewise-linear fashion. Furthermore, the second-derivative term of the Hessian is approximated by quasi-Newton iteration to obtain an augmented Gauss-Newton model Hessian, which may be useful for highly nonlinear residuals when starting from a poor initial point (i.e., randomly initialized weight parameters). Using small-scale examples, we illustrate how these devices come together into a promising NN learning algorithm.
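As a concrete illustration of the ingredients described above, the following is a minimal NumPy sketch, not the paper's exact algorithm: it implements a generic Powell dogleg step over the local quadratic model together with a Gauss-Newton Hessian augmented by a curvature matrix S. The self-scaling quasi-Newton update of S and the trust-radius adaptation used in the paper are omitted (S is simply held at zero in the toy demo), and all function and variable names are illustrative.

import numpy as np

def dogleg_step(g, B, delta):
    # Powell dogleg step for the quadratic model m(p) = g@p + 0.5*p@B@p,
    # restricted to the trust region ||p|| <= delta.
    # g: gradient J.T @ r; B: (augmented) model Hessian; delta: trust radius.
    p_newton = np.linalg.solve(B, -g)          # full step on the quadratic model
    if np.linalg.norm(p_newton) <= delta:
        return p_newton

    # Cauchy point: model minimizer along the steepest-descent direction.
    p_cauchy = -(g @ g) / (g @ B @ g) * g
    norm_pc = np.linalg.norm(p_cauchy)
    if norm_pc >= delta:
        # Even the Cauchy point leaves the region: take a restricted gradient step.
        return -(delta / np.linalg.norm(g)) * g

    # Piecewise-linear dogleg path: move from the Cauchy point toward the
    # Newton step until the trust-region boundary is reached.
    d = p_newton - p_cauchy
    a = d @ d
    b = 2.0 * (p_cauchy @ d)
    c = norm_pc ** 2 - delta ** 2
    tau = (-b + np.sqrt(b ** 2 - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d

def augmented_hessian(J, r, S):
    # Gauss-Newton term J^T J plus an estimate S of the second-derivative
    # (residual-curvature) term sum_i r_i * Hess(r_i).
    return J.T @ J + S

if __name__ == "__main__":
    # Toy demo: fit a single sigmoid "neuron" y = sigmoid(w0*x + w1) by
    # nonlinear least squares, starting from poor initial weights.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=20)
    y = 1.0 / (1.0 + np.exp(-(2.0 * x + 0.5)))
    w = np.array([0.1, 0.0])
    S = np.zeros((2, 2))        # quasi-Newton curvature estimate; held at zero here
    delta = 1.0                 # fixed trust radius (no adaptation in this sketch)
    for _ in range(20):
        z = w[0] * x + w[1]
        f = 1.0 / (1.0 + np.exp(-z))
        r = f - y                                            # residual vector
        J = np.column_stack([f * (1 - f) * x, f * (1 - f)])  # Jacobian dr/dw
        g = J.T @ r
        B = augmented_hessian(J, r, S) + 1e-8 * np.eye(2)    # small damping for safety
        w = w + dogleg_step(g, B, delta)
    print("fitted weights:", w)

In this sketch the dogleg step falls back to a restricted steepest-descent (Cauchy) step whenever the Gauss-Newton step leaves the trust region, which mirrors the restricted Levenberg-Marquardt behaviour mentioned in the abstract.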
Keywords :
iterative methods; learning (artificial intelligence); least squares approximations; neural nets; Gauss-Newton model; Levenberg-Marquardt step; Powell dogleg algorithm; iterative method; learning algorithm; neural-network; nonlinear least squares; quasi-Newton Hessian augmentation; trust-region; Industrial engineering; Jacobian matrices; Least squares approximation; Least squares methods; Neural networks; Newton method; Operations research; Recursive estimation; Supervised learning; Training data
Conference_Title :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.831138