DocumentCode
2584231
Title
Using a minimal action grammar for activity understanding in the real world
Author
Summers-Stay, Douglas ; Teo, Ching L. ; Yang, Yezhou ; Fermüller, Cornelia ; Aloimonos, Yiannis
Author_Institution
Department of Computer Science, University of Maryland, College Park, MD, USA
fYear
2012
fDate
7-12 Oct. 2012
Firstpage
4104
Lastpage
4111
Abstract
There is good reason to believe that humans use some kind of recursive grammatical structure when they recognize and perform complex manipulation activities. We have built a system that automatically constructs a tree structure from observations of an actor performing such activities. The resulting activity trees form a framework for search and understanding, tying action to language. We explore and evaluate the system through experiments on a novel complex-activity dataset captured with synchronized Kinect and SR4000 Time-of-Flight cameras. Processing the combined 3D and 2D image data provides the terminals and events needed to build the tree from the bottom up. Experimental results highlight the contribution of the action grammar in 1) providing a robust structure for complex activity recognition over real data and 2) disambiguating interleaved activities within the same sequence.
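To make the bottom-up construction concrete, here is a minimal sketch of how an activity tree could be assembled from a stream of detected (action, object) events. The rule names (AP, T), the event format, and the example event stream are assumptions made for illustration only; they are not taken from the paper's grammar or implementation.

    # Hedged illustration: rule names, event format, and the sample events
    # below are assumptions for exposition, not the authors' actual code.
    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class Node:
        label: str                              # e.g. "AP", "T", or a terminal symbol
        children: List["Node"] = field(default_factory=list)

        def __repr__(self) -> str:
            if not self.children:
                return self.label
            return f"({self.label} {' '.join(map(repr, self.children))})"

    def build_activity_tree(events: List[Tuple[str, str]]) -> Optional[Node]:
        """Bottom-up construction of an activity tree from (action, object) events.

        Assumed minimal grammar (illustrative only):
            AP -> A O        an action applied to an object
            T  -> T AP | AP  an activity chains action phrases left-recursively
        """
        root: Optional[Node] = None
        for action, obj in events:
            ap = Node("AP", [Node(action), Node(obj)])            # terminals -> action phrase
            root = ap if root is None else Node("T", [root, ap])  # attach to the growing tree
        return root

    if __name__ == "__main__":
        # Hypothetical event stream from perception (not from the paper's dataset).
        events = [("grasp", "knife"), ("cut", "cucumber"), ("place", "knife")]
        print(build_activity_tree(events))
        # (T (T (AP grasp knife) (AP cut cucumber)) (AP place knife))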
Keywords
gesture recognition; grammars; SR4000 time of flight camera; activity trees; activity understanding; complex activity recognition; complex manipulation activity; minimal action grammar; recursive grammatical structure; synchronized Kinect; tree structure; Cameras; Feature extraction; Grammar; Hidden Markov models; Humans; Object recognition; Vegetation
fLanguage
English
Publisher
ieee
Conference_Title
2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Conference_Location
Vilamoura
ISSN
2153-0858
Print_ISBN
978-1-4673-1737-5
Type
conf
DOI
10.1109/IROS.2012.6385483
Filename
6385483