DocumentCode :
3288972
Title :
The discrete state linear quadratic problem
Author :
Quadrat, Jean-Pierre
Author_Institution :
INRIA, Rocquencourt, France
fYear :
1989
fDate :
13-15 Dec 1989
Firstpage :
736
Abstract :
The author poses and solves control problems for discrete-state Markov chains in which the control affects the transition probabilities in an affine way and the cost is quadratic in the state and the control. In this case the dynamic programming equation becomes a Riccati equation. This type of problem arises in particular when the Hamilton-Jacobi-Bellman (HJB) equation is discretized. The order of approximation of some schemes for discretizing the HJB equation is discussed in connection with these results.
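The abstract states that the dynamic programming equation reduces to a Riccati equation. For orientation only, the sketch below shows the classical finite-horizon discrete-time LQ Riccati recursion (continuous state, no Markov chain), which is the standard analogue of that structure; it is not the paper's discrete-state construction, and the function name, matrices A, B, Q, R, Qf, and horizon N are illustrative assumptions.

```python
import numpy as np

def lq_riccati_recursion(A, B, Q, R, Qf, N):
    """Backward dynamic-programming (Riccati) recursion for the classical
    finite-horizon discrete-time LQ problem
        x_{k+1} = A x_k + B u_k,
        cost = sum_k (x_k' Q x_k + u_k' R u_k) + x_N' Qf x_N.
    Returns the cost-to-go matrices P_k and feedback gains K_k."""
    P = Qf
    Ps, Ks = [P], []
    for _ in range(N):
        # Minimizing control u_k = -K_k x_k obtained from the quadratic cost-to-go.
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update of the cost-to-go matrix.
        P = Q + A.T @ P @ (A - B @ K)
        Ps.append(P)
        Ks.append(K)
    return Ps[::-1], Ks[::-1]

# Illustrative scalar example (all values assumed, not from the paper).
A = np.array([[1.0]]); B = np.array([[1.0]])
Q = np.array([[1.0]]); R = np.array([[1.0]]); Qf = np.array([[1.0]])
P, K = lq_riccati_recursion(A, B, Q, R, Qf, N=10)
```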
Keywords :
Markov processes; dynamic programming; probability; stochastic systems; Hamilton-Jacobi-Bellman equation; Markov chains; discrete state linear quadratic problem; dynamic programming; stochastic systems; transition probabilities; Boundary conditions; Cost function; Dynamic programming; Jacobian matrices; Linear systems; Riccati equations; Stability; State-space methods; Stochastic processes; Viscosity;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 28th IEEE Conference on Decision and Control, 1989
Conference_Location :
Tampa, FL
Type :
conf
DOI :
10.1109/CDC.1989.70215
Filename :
70215