DocumentCode :
565598
Title :
Multi-user multi-touch command and control of multiple simulated robots
Author :
McCann, Eric J. ; McSheehy, Sean ; Yanco, Holly A.
Author_Institution :
Dept. of Comput. Sci., Univ. of Massachusetts Lowell, Lowell, MA, USA
fYear :
2012
fDate :
5-8 March 2012
Firstpage :
413
Lastpage :
413
Abstract :
This video demonstrates three users sharing control of eight simulated robots with a Microsoft Surface and two Apple iPads using our Multi-user Multi-touch Multi-robot Command and Control Interface. The command and control interfaces are all capable of moving their world camera through space, tasking one or more robots with a series of waypoints, and assuming manual control of a single robot for inspection of its sensors and teleoperation. They display full-screen images sent from their user's world camera, overlaid with icons that show the position and selection state of each robot in the camera's field of view, dots that indicate each robot's current destination, and rectangles that correspond to each other user's field of view. One multi-touch interface runs on a Microsoft Surface and the others on Apple iPads; all have the same functional capabilities, apart from a few differences due to the form factor and touch sensing method of each platform. The Surface interface can interpret gestures that involve more than just fingertips, such as placing both fists on the screen to make all robots stop and wait for new commands. Because iPads sense touch capacitively, they do not support detection of such gestures. The Surface interface allows its user to move their world camera while simultaneously teleoperating one of the robots with our Dynamically Resizing Ergonomic and Multi-touch Controller (DREAM Controller) [1, 2]. On the iPads, however, the command and control mode and the teleoperation mode are mutually exclusive. The robots are simulated in Microsoft Robotics Developer Studio. Each user's world camera has movement capabilities similar to those of a quad-copter. All UDP communications between users and robots are handled by a single server that routes messages to the appropriate targets, allowing both the number of robots and the number of users to scale.
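The abstract's single-server routing architecture can be illustrated with a minimal sketch. The paper does not publish its wire format or server code, so everything here is an assumption for illustration: a hypothetical `UdpRouter` class, a made-up `target_id|payload` datagram framing, and a registration table mapping endpoint IDs (users or robots) to UDP addresses.

```python
import socket

class UdpRouter:
    """Hypothetical single-server UDP router (sketch, not the authors' code).

    Assumed framing: each datagram is b"target_id|payload"; the server
    forwards the payload to whatever address is registered for target_id.
    Adding a robot or a user is just another register() call, which is the
    scalability property the abstract describes.
    """

    def __init__(self, host="127.0.0.1", port=0):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind((host, port))
        self.endpoints = {}  # endpoint_id -> (host, port) of a user or robot

    def register(self, endpoint_id, addr):
        # Record where datagrams addressed to endpoint_id should be delivered.
        self.endpoints[endpoint_id] = addr

    def route_once(self):
        # Receive one datagram, split off the target id, and forward the
        # payload to the registered address (silently dropped if unknown).
        data, _src = self.sock.recvfrom(4096)
        target_id, _, payload = data.partition(b"|")
        addr = self.endpoints.get(target_id.decode())
        if addr is not None:
            self.sock.sendto(payload, addr)
```

Because every participant talks only to the router, neither users nor robots need to know each other's addresses, which is one plausible reading of how the system scales in both dimensions.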
Keywords :
control engineering computing; ergonomics; gesture recognition; graphical user interfaces; inspection; message passing; mobile robots; multi-robot systems; robot vision; telerobotics; touch sensitive screens; transport protocols; video cameras; video signal processing; Apple iPads; DREAM controller; Microsoft Robotics Developer Studio; Microsoft Surface; UDP communications; camera field of view; dynamically resizing ergonomic and multitouch controller; full-screen image display; gesture interpretation; icons; manual control; message routing; movement capability; multiple simulated robots; multiuser multitouch multirobot command and control interface; sensor inspection; teleoperation; user world camera; Cameras; Command and control systems; Computer science; Educational institutions; Robot vision systems; Human-Robot Interaction; Multi-touch;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Human-Robot Interaction (HRI), 2012 7th ACM/IEEE International Conference on
Conference_Location :
Boston, MA
ISSN :
2167-2121
Print_ISBN :
978-1-4503-1063-5
Electronic_ISBN :
2167-2121
Type :
conf
Filename :
6249592