DocumentCode :
3495446
Title :
Belief-node condensation for online POMDP algorithms
Author :
Rens, Gavin ; Ferrein, Alexander
Author_Institution :
Centre for Artificial Intelligence Research, Meraka Institute, Pretoria, South Africa
fYear :
2013
fDate :
9-12 Sept. 2013
Firstpage :
1
Lastpage :
5
Abstract :
We consider online partially observable Markov decision process (POMDP) algorithms, which compute policies by local look-ahead from the current belief-state. One problem is that belief-nodes deeper in the decision-tree contain an increasing number of states with non-zero probability. The computation time for updating a belief-state is exponential in the number of states contained in the belief, and a belief-update is performed for every node in the search tree. It therefore pays to reduce the size of belief-nodes while preserving the information they contain. In this paper, we compare four fast and frugal methods for reducing the size of belief-nodes in the search tree, thereby improving the running time of online POMDP algorithms.
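The following Python sketch illustrates the general idea of belief-node condensation only; the two rules shown (probability-threshold pruning and keep-top-k, both with renormalisation) are assumed illustrations and are not necessarily the four methods compared in the paper. A belief-state is represented here as a dictionary from states to probabilities.

# Minimal sketch of belief-node condensation (assumed, hypothetical rules).
# A belief is a dict mapping state identifiers to probabilities summing to 1.

def condense_by_threshold(belief, epsilon=0.01):
    """Drop states whose probability falls below epsilon, then renormalise."""
    kept = {s: p for s, p in belief.items() if p >= epsilon}
    total = sum(kept.values())
    # If everything falls below the threshold, keep the original belief.
    return {s: p / total for s, p in kept.items()} if total > 0 else belief

def condense_top_k(belief, k=10):
    """Keep only the k most probable states, then renormalise."""
    top = sorted(belief.items(), key=lambda sp: sp[1], reverse=True)[:k]
    total = sum(p for _, p in top)
    return {s: p / total for s, p in top}

# Usage: condense each node's belief before expanding it in the look-ahead
# tree, so the subsequent belief-update touches fewer non-zero-probability states.
belief = {"s0": 0.55, "s1": 0.30, "s2": 0.10, "s3": 0.04, "s4": 0.01}
print(condense_by_threshold(belief, epsilon=0.05))
print(condense_top_k(belief, k=3))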
Keywords :
Markov processes; decision trees; multi-agent systems; probability; tree searching; POMDP; agent next action selection; belief-node condensation; belief-state; belief-update; computation time; decision-tree; local look-ahead; nonzero probability; online POMDP algorithm; online partially observable Markov decision processes; search tree; Heuristic algorithms; Markov processes; Planning; Probability distribution; Robots; Sensors; Uncertainty;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
AFRICON, 2013
Conference_Location :
Pointe-Aux-Piments
ISSN :
2153-0025
Print_ISBN :
978-1-4673-5940-5
Type :
conf
DOI :
10.1109/AFRCON.2013.6757747
Filename :
6757747