DocumentCode
404332
Title
Trajectory following optimization by gradient transformation differential equations
Author
Grantham, Walter I.
Author_Institution
Sch. of Mech. & Mater. Eng., Washington State Univ., Pullman, WA, USA
Volume
5
fYear
2003
fDate
9-12 Dec. 2003
Firstpage
5496
Abstract
For minimizing a scalar-valued function, we develop and investigate a family of gradient transformation differential equation algorithms. This family includes, as special cases: steepest descent, Newton's method, Levenberg-Marquardt, and a gradient-enhanced Newton algorithm that we develop. Using Rosenbrock's "banana" function we study the stiffness of the gradient transformation family in terms of Lyapunov exponent time histories. For the example function, Newton's method and the Levenberg-Marquardt modification do not yield global asymptotic stability, whereas steepest descent does. However, Newton's method (from an initial point where it does work) is not stiff and is approximately 100 times as fast as steepest descent. In contrast, the gradient-enhanced Newton method is globally convergent, is not stiff, and is approximately 25 times faster than Newton's method and approximately 2500 times faster than steepest descent.
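The gradient-transformation family described in the abstract can be illustrated with a minimal sketch: trajectories of the ODE dx/dt = -P(x) ∇f(x), where P = I gives steepest descent, P = H⁻¹ gives Newton's method, and P = (H + λI)⁻¹ gives a Levenberg-Marquardt-style modification. This is not the paper's implementation (the paper analyzes stiffness via Lyapunov exponents; here we simply integrate with forward Euler), and the function names, the `lam` parameter, and the step sizes are illustrative assumptions.

```python
import numpy as np

def f(x):
    """Rosenbrock's "banana" function, the test case used in the paper."""
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

def hess(x):
    return np.array([
        [2.0 - 400.0 * (x[1] - 3.0 * x[0]**2), -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def transform(x, method, lam=1.0):
    """Gradient-transformation matrix P(x) in dx/dt = -P(x) grad f(x)."""
    if method == "steepest":   # P = I: steepest descent flow
        return np.eye(2)
    if method == "newton":     # P = H^{-1}: Newton flow (needs H invertible)
        return np.linalg.inv(hess(x))
    if method == "lm":         # P = (H + lam*I)^{-1}: Levenberg-Marquardt flow
        return np.linalg.inv(hess(x) + lam * np.eye(2))
    raise ValueError(method)

def follow(x0, method, dt, steps, lam=1.0):
    """Trajectory following: forward-Euler integration of the ODE."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * transform(x, method, lam) @ grad(x)
    return x
```

Note the trade-off the abstract quantifies: the steepest-descent flow is stiff on the banana function and tolerates only tiny Euler steps, while the Newton flow (where the Hessian stays invertible) satisfies dg/dt = -g along trajectories and converges with far fewer, larger steps.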
Keywords
Lyapunov methods; Newton method; asymptotic stability; differential equations; gradient methods; optimisation; Levenberg-Marquardt method; Lyapunov exponent; Newton's method; Rosenbrock's function; global asymptotic stability; gradient enhanced Newton algorithm; gradient transformation; scalar valued function; steepest descent; trajectory following optimization; Algorithm design and analysis; Control systems; Design optimization; Differential equations; History; Linear systems; Nonlinear systems; Optimal control; Testing
fLanguage
English
Publisher
ieee
Conference_Titel
Decision and Control, 2003. Proceedings. 42nd IEEE Conference on
ISSN
0191-2216
Print_ISBN
0-7803-7924-1
Type
conf
DOI
10.1109/CDC.2003.1272512
Filename
1272512