Title :
Sample-path average optimality for Markov control processes
Author :
Lasserre, Jean B.
Author_Institution :
LAAS, CNRS, Toulouse, France
Date :
10/1/1999 12:00:00 AM
Abstract :
The author considers a Markov control process with Borel state and action spaces, unbounded costs, under the long-run sample-path average cost criterion. He proves that, under very weak assumptions on the transition law and a moment assumption on the one-step cost, there exists a stationary policy with invariant probability distribution v that is sample-path average cost optimal for v-almost all initial states. In addition, every expected average-cost optimal stationary policy is in fact (liminf) sample-path average-cost optimal and strongly expected average-cost optimal.
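For orientation, the long-run sample-path average cost criterion referred to in the abstract can be sketched as follows, under standard notation that is assumed here rather than taken from the paper (one-step cost c, state process x_t, actions a_t chosen under policy pi):

```latex
% Sketch of the sample-path (pathwise) average cost under a policy \pi
% from initial state x; notation assumed, not quoted from the paper.
J(\pi, x) \;=\; \limsup_{n \to \infty} \frac{1}{n} \sum_{t=0}^{n-1} c(x_t, a_t)
\quad \text{(} P_x^{\pi}\text{-almost surely)} .
```

The "(liminf)" qualifier in the abstract refers to the weaker variant in which the limsup above is replaced by a liminf; optimality then means the policy achieves the smallest possible value of this pathwise average for almost all initial states.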
Keywords :
Markov processes; decision theory; discrete time systems; probability; sampled data systems; Borel state spaces; Markov control processes; action spaces; expected average-cost optimal stationary policy; invariant probability distribution; long-run sample-path average cost criterion; one-step cost; sample-path average optimality; transition law; unbounded costs; very weak assumptions; Adaptive control; Adaptive systems; Automatic control; Cost function; Nonlinear systems; Optimal control; Probability distribution; Process control; Programmable control; Sampling methods;
Journal_Title :
Automatic Control, IEEE Transactions on