DocumentCode :
3526807
Title :
Multi-armed recommendation bandits for selecting state machine policies for robotic systems
Author :
Matikainen, Pyry ; Furlong, P. Michael ; Sukthankar, Rahul ; Hebert, Martial
Author_Institution :
Robot. Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
2013
fDate :
6-10 May 2013
Firstpage :
4545
Lastpage :
4551
Abstract :
We investigate the problem of selecting a state machine from a library to control a robot. We are particularly interested in this problem when evaluating such state machines on a particular robotics task is expensive. As a motivating example, we consider a problem where a simulated vacuuming robot must select a driving state machine well-suited for a particular (unknown) room layout. By borrowing concepts from collaborative filtering (recommender systems such as Netflix and Amazon.com), we present a multi-armed bandit formulation that incorporates recommendation techniques to efficiently select state machines for individual room layouts. We show that this formulation outperforms the individual approaches (recommendation, multi-armed bandits) as well as the baseline of selecting the "average best" state machine across all rooms.
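The abstract combines a multi-armed bandit with collaborative filtering to pick a policy for a new, unknown room. The snippet below is a minimal, hypothetical sketch of that general idea, not the paper's algorithm: a UCB-style bandit over a policy library whose arm scores are blended with reward predictions from a low-rank factorization of historical policy-versus-room performance. The data, the latent dimension, the 50/50 blending weight, and all names are assumptions made for illustration.

```python
# Illustrative sketch (not the published method): a UCB-style bandit over a
# library of state machine policies, warm-started by a collaborative
# filtering prior learned from past per-room performance.
import numpy as np

rng = np.random.default_rng(0)
n_policies, n_rooms = 20, 50

# Hypothetical "ratings matrix": mean reward of each policy in previously
# seen rooms (policies play the role of items, rooms the role of users).
history = rng.uniform(0.0, 1.0, size=(n_policies, n_rooms))

# Low-rank factorization gives latent features per policy, as in
# factorization-based recommender systems.
U, S, _ = np.linalg.svd(history, full_matrices=False)
policy_factors = U[:, :3] * S[:3]            # (n_policies, 3) latent vectors

def recommend_prior(observed):
    """Predict rewards of untried policies in the new room from the few
    (policy, reward) pairs observed so far, via least squares on the
    latent factors -- the recommendation step."""
    idx = [i for i, _ in observed]
    y = np.array([r for _, r in observed])
    w, *_ = np.linalg.lstsq(policy_factors[idx], y, rcond=None)
    return policy_factors @ w

# Bandit loop on a new, unknown room; true rewards are simulated here.
true_reward = rng.uniform(0.0, 1.0, size=n_policies)
counts = np.zeros(n_policies)
means = np.zeros(n_policies)
observed = []

for t in range(1, 101):
    prior = (recommend_prior(observed) if len(observed) >= 3
             else np.full(n_policies, 0.5))
    bonus = np.sqrt(2.0 * np.log(t) / np.maximum(counts, 1.0))
    score = 0.5 * means + 0.5 * prior + bonus   # blend estimate, prior, UCB bonus
    arm = int(np.argmax(score))
    reward = true_reward[arm] + 0.1 * rng.standard_normal()
    counts[arm] += 1
    means[arm] += (reward - means[arm]) / counts[arm]
    observed.append((arm, reward))

print("policy selected most often:", int(np.argmax(counts)))
```

The point of the sketch is that the recommendation prior lets the bandit concentrate its few expensive evaluations on policies predicted to suit the current room, rather than exploring every arm uniformly.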
Keywords :
finite state machines; information filtering; intelligent robots; learning (artificial intelligence); manipulators; recommender systems; service robots; collaborative filtering; multiarmed recommendation bandits; recommender systems; robotic systems; room layout; simulated vacuuming robot; state machine policies; Collaboration; Collision avoidance; Layout; Libraries; Robot sensing systems; Vectors
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location :
Karlsruhe, Germany
ISSN :
1050-4729
Print_ISBN :
978-1-4673-5641-1
Type :
conf
DOI :
10.1109/ICRA.2013.6631223
Filename :
6631223