DocumentCode :
30773
Title :
Evaluation of Head Gaze Loosely Synchronized With Real-Time Synthetic Speech for Social Robots
Author :
Srinivasan, V. ; Bethel, Cindy L. ; Murphy, R.R.
Author_Institution :
Dept. of Comput. Sci. & Eng., Texas A&M Univ., College Station, TX, USA
Volume :
44
Issue :
6
fYear :
2014
fDate :
Dec. 2014
Firstpage :
767
Lastpage :
778
Abstract :
This study demonstrates that robots can achieve socially acceptable interactions using loosely synchronized head gaze-speech acts. Prior approaches use tightly synchronized head gaze-speech, which requires significant human effort and time to manually annotate synchronization events in advance, restricts interactive dialog, or requires that the operator act as a puppeteer. This paper describes how autonomous synchronization of head gaze can be achieved by exploiting affordances in the sentence structure and time delays. A 93-participant user study was conducted in a simulated disaster site. The rescue robot “Survivor Buddy” generated head gaze for a victim management scenario using a 911 dialog. The study used pre- and post-interaction questionnaires to compare the social acceptance level of loosely synchronized head gaze-speech against tightly synchronized head gaze-speech (manual annotation) and no head gaze-speech conditions. The results indicated that for the attributes of Arousal (from the Self-Assessment Manikin), Robot Likeability, Human-Like Behavior, Understanding Robot Behavior, Gaze-Speech Synchronization, Looking at Objects at Appropriate Times, and Natural Movement, loosely synchronized head gaze-speech is similar to tightly synchronized head gaze-speech and preferred over the no head gaze-speech case. This study contributes to a fundamental understanding of the role of social head gaze in the social acceptance of human-machine interaction and of how social gaze can be produced, and it promotes practical implementation in social robots.
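Illustration (not from the paper): the abstract describes loose synchronization as exploiting sentence structure and time delays rather than pre-annotated events. The following minimal Python sketch shows one way such a scheme could look, where gaze shifts are triggered by object words in the sentence and timed from an assumed average speech rate. All names here (GAZE_TARGETS, WORDS_PER_SECOND, send_gaze_command, speak) are hypothetical placeholders, not the authors' implementation.

# Hypothetical sketch of loosely synchronized head gaze-speech:
# gaze shifts are scheduled from sentence structure (object words)
# and estimated word-timing delays, not from manual annotations.
import re
import threading
import time

WORDS_PER_SECOND = 2.5          # assumed average synthetic-speech rate
GAZE_TARGETS = {                # assumed mapping of spoken words to gaze directions
    "ceiling": "up",
    "rubble": "down",
    "medic": "toward_exit",
}

def send_gaze_command(direction: str) -> None:
    # Placeholder for the robot's head-gaze actuator interface.
    print(f"[gaze] turning head: {direction}")

def schedule_loose_gaze(sentence: str) -> None:
    # Scan the sentence for object references and schedule a gaze shift
    # at roughly the moment each word is expected to be spoken.
    words = re.findall(r"[\w']+", sentence.lower())
    for index, word in enumerate(words):
        if word in GAZE_TARGETS:
            delay = index / WORDS_PER_SECOND   # loose, estimate-based timing
            threading.Timer(delay, send_gaze_command,
                            args=(GAZE_TARGETS[word],)).start()

def speak(sentence: str) -> None:
    # Placeholder for real-time text-to-speech output.
    print(f"[speech] {sentence}")
    time.sleep(len(sentence.split()) / WORDS_PER_SECOND)

if __name__ == "__main__":
    utterance = "Please look at the rubble while the medic comes to help you."
    schedule_loose_gaze(utterance)   # gaze shifts fire on estimated word timing
    speak(utterance)                 # speech and gaze remain only loosely coupled

Because the gaze timing comes from an estimate rather than annotated synchronization events, the coupling stays loose in the sense the abstract describes; this sketch is only one plausible reading of that idea.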
Keywords :
human-robot interaction; interactive systems; rescue robots; synchronisation; 911 dialog; Survivor Buddy rescue robot; autonomous head gaze synchronization; human robot interactions; human-machine interaction; loosely synchronized head gaze-speech acts; real-time synthetic speech; self-assessment manikin; social head gaze; social robots; tightly synchronized head gaze-speech; victim management scenario; Human-robot interaction; Interactive systems; Real-time systems; Robot kinematics; Speech; Synchronization; Autonomous generation; human–robot interaction; interactive conversation; social head gaze; user study and evaluation;
fLanguage :
English
Journal_Title :
Human-Machine Systems, IEEE Transactions on
Publisher :
ieee
ISSN :
2168-2291
Type :
jour
DOI :
10.1109/THMS.2014.2342035
Filename :
6879322