Title :
Strategic planning under uncertainties via constrained Markov Decision Processes
Author :
Xu Chu Ding; Allan Pinto; Amit Surana
Author_Institution :
Department of Systems, United Technologies Research Center, East Hartford, CT, USA
Abstract :
In this paper, we propose a hierarchical mission planner in which the state of the world and of the mission are abstracted into corresponding states of a Markov Decision Process (MDP). Transitions in the MDP represent abstract motion actions that are planned by a lower-level probabilistic planner. The cost structure of the MDP is multi-dimensional: each state-action pair is annotated with a vector of metrics such as time and resource requirements. A mission specification consists of three parts: a temporal logic formula defined over state propositions, the choice of a primary cost, and constraints on the remaining secondary costs. The planning problem is then formulated as finding the optimal policy of a Constrained Markov Decision Process (CMDP) under this mission specification. The resulting planning system is tested in a scenario in which an agent is tasked with a complex mission in a hostile urban environment.
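The CMDP formulation described above is commonly solved as a linear program over occupation measures, with the primary cost as the objective and budgets on the secondary costs as extra linear constraints. The following is a minimal, hypothetical sketch of that standard approach for a discounted CMDP with a single secondary-cost budget, written with NumPy and SciPy; the transition model, cost values, discount factor, and budget are illustrative placeholders and do not reproduce the paper's planner, cost metrics, or temporal logic handling.

# Sketch: occupancy-measure linear program for a small discounted CMDP.
# All model data below are placeholders, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

nS, nA, gamma = 3, 2, 0.95
rng = np.random.default_rng(0)

# P[s, a, s'] : transition probabilities (each row over s' sums to 1)
P = rng.dirichlet(np.ones(nS), size=(nS, nA))
c_primary = rng.uniform(0, 1, size=(nS, nA))    # e.g. mission time (objective)
c_secondary = rng.uniform(0, 1, size=(nS, nA))  # e.g. resource usage (constrained)
budget = 12.0                                   # bound on expected discounted secondary cost
mu0 = np.full(nS, 1.0 / nS)                     # initial state distribution

# Decision variables: occupancy measure rho(s, a), flattened to length nS*nA.
# Flow constraints: sum_a rho(s',a) - gamma * sum_{s,a} P(s'|s,a) rho(s,a) = mu0(s')
A_eq = np.zeros((nS, nS * nA))
for sp in range(nS):
    for s in range(nS):
        for a in range(nA):
            A_eq[sp, s * nA + a] = (sp == s) - gamma * P[s, a, sp]
b_eq = mu0

# Secondary-cost constraint: expected discounted secondary cost <= budget
A_ub = c_secondary.reshape(1, -1)
b_ub = np.array([budget])

res = linprog(c=c_primary.ravel(), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
assert res.success, res.message
rho = res.x.reshape(nS, nA)

# Recover a stationary policy from the occupancy measure.
policy = rho / np.maximum(rho.sum(axis=1, keepdims=True), 1e-12)
print("optimal expected primary cost:", res.fun)
print("policy:\n", policy)

Note that the policy recovered from the occupancy measure is in general randomized, which is characteristic of constrained MDPs as opposed to unconstrained ones.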
Keywords :
Markov processes; formal verification; planning (artificial intelligence); temporal logic; abstract motion actions; constrained Markov decision process; hierarchical mission planner; lower-level probabilistic planner; mission specification; state-action pair; temporal logic formula; Abstracts; Automata; Autonomous agents; Planning; Uncertainty;
Conference_Titel :
2013 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location :
Karlsruhe, Germany
Print_ISBN :
978-1-4673-5641-1
DOI :
10.1109/ICRA.2013.6631226