DocumentCode
391309
Title
On the relation of reachability to minimum cost optimal control
Author
Lygeros, John
Author_Institution
Dept. of Eng., Cambridge Univ., UK
Volume
2
fYear
2002
fDate
10-13 Dec. 2002
Firstpage
1910
Abstract
Questions of reachability for continuous and hybrid systems can be formulated as optimal control or game theory problems, whose solutions can be characterised using variants of the Hamilton-Jacobi-Bellman or Isaacs partial differential equations. This paper establishes a link between reachability and invariance problems and viscosity solutions of a Hamilton-Jacobi partial differential equation that arises in optimal control problems where the cost function is the minimum of a function of the state over a given horizon. The form of the resulting partial differential equation (continuity of the Hamiltonian and simple boundary conditions) makes this approach especially attractive from the point of view of numerical computation.
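As a hedged illustration of the approach the abstract describes (not the paper's own implementation): for the toy system xdot = u with |u| <= 1 and a target set {l(x) <= 0}, the min-over-horizon value function V(x,t) = min over trajectories of min_{s in [t,T]} l(x(s)) solves a Hamilton-Jacobi equation that, for this system, reduces to V_t = |V_x| integrated backward from the terminal condition V(x,T) = l(x). The zero sublevel set of V(.,0) is then the backward reachable set of the target over horizon T. A minimal first-order upwind (Godunov) discretisation in Python, with all grid and horizon parameters chosen for illustration only:

```python
import numpy as np

# Toy 1-D example: dynamics xdot = u, |u| <= 1;
# target set {x : l(x) <= 0} with l(x) = |x| - 0.5.
# Backward integration of V_t = |V_x| from V(x,T) = l(x);
# {x : V(x,0) <= 0} approximates the backward reachable set.
dx = 0.01
x = np.arange(-2.0, 2.0 + dx / 2, dx)
l = np.abs(x) - 0.5
T, dt = 0.5, 0.005            # horizon and CFL-stable step (dt <= dx)

V = l.copy()
for _ in range(int(round(T / dt))):
    # Godunov upwinding for the eikonal term |V_x| (front speed 1)
    a = np.empty_like(V)      # backward differences
    b = np.empty_like(V)      # forward differences
    a[1:] = (V[1:] - V[:-1]) / dx
    a[0] = 0.0
    b[:-1] = (V[1:] - V[:-1]) / dx
    b[-1] = 0.0
    grad = np.maximum(np.maximum(a, 0.0), np.maximum(-b, 0.0))
    V = V - dt * grad         # one backward-in-time step

reach = x[V <= 0.0]
print(reach.min(), reach.max())   # front spreads to roughly +/- 1.0
```

With unit speed and horizon T = 0.5, states within distance 0.5 of the target [-0.5, 0.5] can reach it, so the computed set approaches [-1, 1]; the minimum of V is frozen at -0.5, which is exactly the effect the min-over-horizon cost (and the min(0, H) term in the paper's variational PDE) is designed to capture.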
Keywords
cost optimal control; dynamic programming; partial differential equations; Hamilton-Jacobi-Bellman partial differential equations; Isaacs partial differential equations; game theory; invariance problems; minimum cost optimal control; reachability relation; Biological control systems; Biology computing; Boundary conditions; Control systems; Cost function; Differential equations; Dynamic programming; Optimal control; Partial differential equations; Viscosity
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 41st IEEE Conference on Decision and Control, 2002
ISSN
0191-2216
Print_ISBN
0-7803-7516-5
Type
conf
DOI
10.1109/CDC.2002.1184805
Filename
1184805