DocumentCode :
3084831
Title :
Stochastic optimal control with indecomposable cost criteria
Author :
Hopkins, W.E.
Author_Institution :
Princeton University, Princeton, New Jersey
Volume :
26
fYear :
1987
fDate :
9-11 Dec. 1987
Firstpage :
768
Lastpage :
768
Abstract :
The formal minimization of a nonlinear cost functional for a diffusion process defined by dξ1 = F1(ξ1, u)dt + G(ξ1)dw is straightforward if the functional can be represented in the form E{Φ(ξ1(T), ξ2(T)) : ξ1(0) = x1, ξ2(0) = c}, where Φ is a point function, dξ2/dt = F2(ξ1, ξ2, u), and c is a constant. If the Hamilton-Jacobi-Bellman equation -V_t = (1/2) tr(G G^T V_{x1x1}) + min_u [⟨F1, V_{x1}⟩ + ⟨F2, V_{x2}⟩], V(T, x1, x2) = Φ(x1, x2), has a smooth solution, the optimal cost is V(0, x1, c). When F2 is linear in ξ2 and Φ is a linear or exponential function, the corresponding optimal control may depend only on ξ1(t), as is the case for integral plus terminal cost criteria [2] and exponential of integral plus terminal cost criteria [1], [3], [4]. In general, however, the optimal control is a feedback function of both ξ1(t) and the auxiliary variable ξ2(t). For a special class of problems in which F2 is linear in ξ2, a verification lemma is given and the Hamilton-Jacobi-Bellman equation is solved by the policy iteration method.
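The abstract's closing step, solving the Hamilton-Jacobi-Bellman equation by policy iteration, can be illustrated on a finite-state discretization. The sketch below is a generic policy-iteration routine on a hypothetical two-state, two-action Markov decision process (the transition matrices, costs, and discount factor are invented for illustration and are not from the paper): alternate between exactly evaluating the current policy's value function and improving the policy greedily until it stabilizes.

```python
import numpy as np

def policy_iteration(P, cost, gamma=0.95):
    """Policy iteration for a discounted finite MDP.

    P[a] is the n x n transition matrix under action a;
    cost[s, a] is the running cost of action a in state s.
    Returns the optimal value function V and policy.
    """
    n_states, n_actions = cost.shape
    policy = np.zeros(n_states, dtype=int)
    while True:
        # Policy evaluation: solve the linear system
        # (I - gamma * P_pi) V = c_pi for the current policy.
        P_pi = np.array([P[policy[s]][s] for s in range(n_states)])
        c_pi = cost[np.arange(n_states), policy]
        V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, c_pi)
        # Policy improvement: minimize the one-step lookahead cost.
        Q = np.stack([cost[:, a] + gamma * P[a] @ V
                      for a in range(n_actions)], axis=1)
        new_policy = Q.argmin(axis=1)
        if np.array_equal(new_policy, policy):
            return V, policy  # policy is stable, hence optimal
        policy = new_policy

# Hypothetical example: action 1 has lower running cost in both states.
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),   # transitions under action 0
     np.array([[0.5, 0.5], [0.5, 0.5]])]   # transitions under action 1
cost = np.array([[1.0, 0.5],
                 [2.0, 0.5]])
V, policy = policy_iteration(P, cost)
```

In the continuous problem of the paper, the evaluation step corresponds to solving a linear PDE for the value of a fixed control and the improvement step to the pointwise minimization inside the HJB equation; the finite version above is only the discrete analogue of that loop.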
Keywords :
Cost function; Diffusion processes; Feedback; Integral equations; Jacobian matrices; Linear systems; Optimal control; Performance analysis; Stochastic processes; Stochastic systems;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Decision and Control, 1987. 26th IEEE Conference on
Conference_Location :
Los Angeles, California, USA
Type :
conf
DOI :
10.1109/CDC.1987.272493
Filename :
4049370