Title :
Adaptive control design approximating solution of Hamilton-Jacobi-Bellman equation for nonlinear strict-feedback system with uncertainties
Author :
Okano, Keizo ; Hagino, Kojiro
Author_Institution :
Dept. of Syst. Eng., Univ. of Electro-Commun., Chofu
Abstract :
In this paper, we study an optimal control problem for a nonlinear system with uncertainties. It is shown that a positive definite differentiable function can be converted into a function that approximates the solution of a Hamilton-Jacobi-Bellman (HJB) equation by multiplying it by a scalar coefficient adjusted for each state. It is also shown that a Lyapunov function designed by an adaptive backstepping method can be converted into a function that approximates the solution of an HJB equation with an unknown parameter, together with an adaptive law for each state. The proposed controller does not attain the minimum value of the objective function, but it reduces the objective function value in comparison with a backstepping controller. The effectiveness of the proposed controller is shown by numerical examples.
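For reference, a minimal sketch of the setting the abstract refers to, written in standard notation that is assumed here (the dynamics f, g, weights Q, R, candidate function W, and coefficient \kappa are illustrative and are not taken from the paper): for dynamics \dot{x} = f(x) + g(x)u and cost J = \int_0^\infty \left( x^\top Q x + u^\top R u \right) dt, the HJB equation for the value function V(x) reads
V_x(x) f(x) - \tfrac{1}{4} V_x(x) g(x) R^{-1} g(x)^\top V_x(x)^\top + x^\top Q x = 0, \qquad u^*(x) = -\tfrac{1}{2} R^{-1} g(x)^\top V_x(x)^\top,
where V_x = \partial V / \partial x is the row-vector gradient. In these terms, the approach described above takes a given positive definite function W(x), such as a backstepping Lyapunov function, and uses V(x) \approx \kappa(x) W(x), with the scalar \kappa(x) adjusted at each state so that the left-hand side of the HJB equation is driven close to zero.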
Keywords :
Lyapunov methods; adaptive control; approximation theory; control system synthesis; differential equations; feedback; nonlinear control systems; optimal control; uncertain systems; Hamilton-Jacobi-Bellman equation; Lyapunov function; adaptive backstepping method; adaptive control design; differentiable function; nonlinear strict-feedback system; optimal control problem; scalar value coefficient; Adaptive control; Backstepping; Control systems; Design engineering; Lyapunov method; Nonlinear equations; Nonlinear systems; Optimal control; Riccati equations; Uncertainty; Control Lyapunov function; Hamilton-Jacobi-Bellman (HJB) equation; adaptive control; nonlinear optimal control;
Conference_Titel :
SICE Annual Conference, 2008
Conference_Location :
Tokyo
Print_ISBN :
978-4-907764-30-2
Electronic_ISBN :
978-4-907764-29-6
DOI :
10.1109/SICE.2008.4654651