DocumentCode :
1638006
Title :
Input modality and task complexity: Do they relate?
Author :
Stollnberger, Gerald ; Weiss, Astrid ; Tscheligi, Manfred
Author_Institution :
CDL on Contextual Interfaces, Univ. of Salzburg, Salzburg, Austria
fYear :
2013
Firstpage :
233
Lastpage :
234
Abstract :
In the research field of Human-Robot Collaboration (HRC), choosing the right input modality is a crucial aspect of successful cooperation, especially across different levels of task complexity. In this paper we present a preliminary study conducted to investigate the correlation between input modalities and task complexity. We assume that specific input devices are suitable for specific levels of task complexity in HRC tasks. In our study, participants could choose between two different input modalities to race Lego Mindstorm robots against each other. One of the main findings was that both factors (input modality and task complexity) have a strong impact on task performance and user satisfaction. Furthermore, we found that users' perceptions of their performance differed from reality in some cases.
Keywords :
gesture recognition; human-robot interaction; mobile radio; telerobotics; HRC tasks; Lego Mindstorm robots; PC-remote control; gesture-based interface; human-robot collaboration; input modalities; mobile phone; task complexity; task performance; user satisfaction; Complexity theory; Performance evaluation; Service robots; Time measurement; gesture; mindstorm; remote; speech
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
Conference_Location :
Tokyo
ISSN :
2167-2121
Print_ISBN :
978-1-4673-3099-2
Electronic_ISBN :
2167-2121
Type :
conf
DOI :
10.1109/HRI.2013.6483587
Filename :
6483587