DocumentCode :
3748873
Title :
Understanding Everyday Hands in Action from RGB-D Images
Author :
Grégory Rogez;James S. Supancic;Deva Ramanan
Author_Institution :
Inria Rhône-Alpes, Grenoble, France
fYear :
2015
Firstpage :
3889
Lastpage :
3897
Abstract :
We analyze functional manipulations of handheld objects, formalizing the problem as one of fine-grained grasp classification. To do so, we make use of a recently developed fine-grained taxonomy of human-object grasps. We introduce a large dataset of 12,000 RGB-D images covering 71 everyday grasps in natural interactions. Our dataset differs from past work (typically addressed from a robotics perspective) in its scale, diversity, and combination of RGB and depth data. From a computer-vision perspective, our dataset allows for the exploration of contact and force prediction (crucial concepts in functional grasp analysis) from perceptual cues. We present extensive experimental results with state-of-the-art baselines, illustrating the role of segmentation, object context, and 3D understanding in functional grasp analysis. We demonstrate a nearly 2X improvement over prior work and a naive deep baseline, while pointing out important directions for improvement.
Keywords :
"Taxonomy","Force","Three-dimensional displays","Solid modeling","Robots","Kinematics","Cameras"
Publisher :
ieee
Conference_Title :
Computer Vision (ICCV), 2015 IEEE International Conference on
Electronic_ISSN :
2380-7504
Type :
conf
DOI :
10.1109/ICCV.2015.443
Filename :
7410800