DocumentCode :
2703008
Title :
Collaborative grasp planning with multiple object representations
Author :
Brook, Peter ; Ciocarlie, Matei ; Hsiao, Kaijen
Author_Institution :
Willow Garage Inc., Menlo Park, CA, USA
fYear :
2011
fDate :
9-13 May 2011
Firstpage :
2851
Lastpage :
2858
Abstract :
Grasp planning based on perceived sensor data of an object can be performed in different ways, depending on the chosen semantic interpretation of the sensed data. For example, if the object can be recognized and a complete 3D model is available, a different planning tool can be selected than in the case where only the raw sensed data, such as a single point cloud, is available. Instead of choosing between these options, we present a framework that combines them, aiming to find consensus on how the object should be grasped by using the information from each object representation according to its confidence level. We show that this method is robust to common errors in perception, such as incorrect object recognition, while also taking into account potential grasp execution errors due to imperfect robot calibration. We illustrate this method on the PR2 robot by grasping objects common in human environments.
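The abstract describes fusing grasp evaluations from several object representations (for example, a recognized 3D model versus the raw point cloud), each weighted by the confidence assigned to that interpretation of the sensor data. Below is a minimal illustrative sketch of such confidence-weighted fusion; the representation names, scores, and the simple linear weighting are assumptions for illustration only, not the paper's actual algorithm.

# Hypothetical sketch: confidence-weighted fusion of grasp scores across
# object representations. Names and the weighting scheme are illustrative
# assumptions, not the method published in the paper.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Representation:
    name: str                       # e.g. "recognized_mesh", "raw_point_cloud"
    confidence: float               # belief that this interpretation is correct
    grasp_scores: Dict[str, float]  # grasp id -> quality under this representation

def fuse_grasp_scores(reps: List[Representation]) -> List[Tuple[str, float]]:
    """Combine per-representation grasp scores, weighted by normalized confidence."""
    total = sum(r.confidence for r in reps)
    fused: Dict[str, float] = {}
    for r in reps:
        w = r.confidence / total
        for gid, score in r.grasp_scores.items():
            fused[gid] = fused.get(gid, 0.0) + w * score
    # Rank grasps by fused score, best first.
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    reps = [
        Representation("recognized_mesh", 0.7, {"g1": 0.9, "g2": 0.4}),
        Representation("raw_point_cloud", 0.3, {"g1": 0.5, "g2": 0.8}),
    ]
    print(fuse_grasp_scores(reps))  # "g1" ranks first with these numbers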
Keywords :
mobile robots; multi-robot systems; object recognition; path planning; solid modelling; 3D model; PR2 robot; collaborative grasp planning; grasp execution error; human environment; multiple object representation; object recognition; object representation; robot calibration; semantic interpretation; sensor data; Clustering algorithms; Computational modeling; Grasping; Object recognition; Planning; Robot sensing systems;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2011 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location :
Shanghai
ISSN :
1050-4729
Print_ISBN :
978-1-61284-386-5
Type :
conf
DOI :
10.1109/ICRA.2011.5980490
Filename :
5980490