DocumentCode
565815
Title
Hardware-assisted multiple object tracking for human-robot-interaction
Author
Lenz, Claus ; Panin, Giorgio ; Röder, Thorsten ; Wojtczyk, Martin ; Knoll, Alois
Author_Institution
Robot. & Embedded Syst. Lab., Tech. Univ., Garching, Germany
fYear
2009
fDate
11-13 March 2009
Firstpage
283
Lastpage
284
Abstract
At present, the collaboration of human and robot is mostly restricted to a master-slave level: a human teleoperates the robot or programs it off-line, so that the robot can only execute static tasks. In industrial production, these limitations, together with the safety requirements for the human worker, currently make collaboration nearly impossible. To ensure safety, the workspaces of humans and robots are therefore strictly separated in time or in space. This workspace splitting does not exploit the potential of humans and robots working together as a team, in which each member can actively assume control and contribute towards solving a given task according to its capabilities. Such a mixed-initiative system supports a spectrum of control levels, allowing human and robot to support each other in different ways as needs and capabilities change throughout a task [4]. With the resulting flexibility and adaptability of a human-robot collaboration team, production scenarios in permanently changing environments as well as the manufacturing of highly customized products become possible. One step towards the goal of an efficient and safe collaboration between human and robot is to give the robot "eyes" to detect and track the human worker, in order to avoid collisions, to figure out what the human is doing, and to be able to hand over objects. In this paper, we propose a hardware-assisted multiple-object tracking system for human-robot interaction based on particle filters and pixel-level likelihoods. The proposed method computes, for each multi-target particle, a full hypothesis map through the rendering engine of the graphics card and compares it with the binary map produced by the image pre-processing on the fragment shader of the GPU.
The approach is formulated in a generic way with respect to the segmentation method, the object shape, and the number of targets, so as to cover a multitude of tasks within human-robot interaction. It is a further development of our work presented in [3] and will be used on our demonstration platform JAHIR (Joint-Action for Humans and Industrial Robots) [2] to enable such a fruitful and safe collaboration between human and industrial robot.
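The core idea described above (weighting each multi-target particle by comparing a rendered hypothesis map against the binary segmentation map, pixel by pixel) can be illustrated with a small CPU-only sketch. This is not the authors' GPU implementation: the fragment-shader comparison is replaced by a NumPy pixel-wise comparison, targets are approximated as filled circles, and all names, image sizes, and parameters (e.g. the likelihood sharpness) are illustrative assumptions.

```python
import numpy as np

H, W = 48, 64        # assumed image size
RADIUS = 5           # assumed circular target silhouette
N_TARGETS = 2
N_PARTICLES = 200

rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:H, 0:W]

def render_hypothesis_map(particle):
    """Render all target silhouettes of one multi-target particle into a
    single binary map (CPU stand-in for the graphics-card rendering)."""
    hyp = np.zeros((H, W), dtype=bool)
    for cy, cx in particle:
        hyp |= (yy - cy) ** 2 + (xx - cx) ** 2 <= RADIUS ** 2
    return hyp

def pixel_likelihood(hyp, seg):
    """Pixel-level likelihood: penalize every pixel where the hypothesis
    map and the binary segmentation map disagree."""
    mismatch = np.count_nonzero(hyp != seg)
    return np.exp(-0.1 * mismatch)

# Synthetic ground truth: two targets; the "segmentation" is their rendering.
truth = np.array([[12.0, 20.0], [30.0, 45.0]])
seg = render_hypothesis_map(truth)

# One filter step: sample particles around a rough prior, weight, resample.
particles = truth[None] + rng.normal(0.0, 3.0, size=(N_PARTICLES, N_TARGETS, 2))
weights = np.array([pixel_likelihood(render_hypothesis_map(p), seg)
                    for p in particles])
weights /= weights.sum()
idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
estimate = particles[idx].mean(axis=0)   # posterior mean after resampling
```

On the GPU, the same comparison would run once per fragment, which is what makes evaluating a full hypothesis map per particle affordable; here the hypothesis rendering and the per-pixel comparison are simply vectorized over the image array.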
Keywords
graphics processing units; human-robot interaction; image segmentation; industrial robots; manufacturing systems; object tracking; rendering (computer graphics); safety; telerobotics; GPU; JAHIR; fragment shader; graphics card; hardware-assisted multiple object tracking; highly customized product manufacturing; human teleoperation; human worker safety; human-robot collaboration team; human-robot-interaction; image-preprocessing; industrial production; joint-action for humans and industrial robots; master-slave level; mixed-initiative system; particle filters; pixel-level likelihoods; production scenarios; rendering engine; segmentation method; static tasks; Abstracts; Computational modeling; Humans; Robots; USA Councils; Visualization; GPU; HRI; joint-action; model-based tracking
fLanguage
English
Publisher
ieee
Conference_Titel
Human-Robot Interaction (HRI), 2009 4th ACM/IEEE International Conference on
Conference_Location
La Jolla, CA
ISSN
2167-2121
Print_ISBN
978-1-60558-404-1
Type
conf
Filename
6256065
Link To Document