Title :
Joint Attention Simulation Using Eye-Tracking and Virtual Humans
Author :
Courgeon, Matthieu ; Rautureau, Gilles ; Martin, Jean-Claude ; Grynszpan, Ouriel
Author_Institution :
Lab-STICC, Université de Bretagne-Sud, Brest, France
Date :
July-Sept. 2014
Abstract :
This article analyses the issues pertaining to the simulation of joint attention with virtual humans. Gaze is a powerful communication channel, as illustrated by the pivotal role of joint attention in social interactions. To our knowledge, there have been only a few attempts to simulate the gazing patterns associated with joint attention as a means of developing empathic virtual agents. Eye-tracking technologies now enable non-invasive gaze-contingent systems that empower the user to lead a virtual human's focus of attention in real time. Although gaze control can be deliberate, most of our visual behavior in everyday life is not. This article reports empirical data suggesting that users have only partial awareness of controlling gaze-contingent displays. The technical challenges involved in detecting the user's focus of attention in virtual reality are reviewed, and several solutions are compared. We designed and tested a platform for creating virtual humans endowed with the ability to follow the user's attention. The article discusses the advantages of simulating joint attention for improving interpersonal skills and user engagement. Impaired joint attention plays a major role in autism. The platform we designed is intended for research on and treatment of autism, and the tests included participants with this disorder.
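The abstract describes a gaze-contingent loop in which an eye tracker detects the user's focus of attention and the virtual human follows it in real time. Below is a minimal Python sketch of such a loop; the names (SCENE_OBJECTS, DWELL_MS, hit_test, joint_attention_loop, orient_agent) and the fixation threshold are illustrative assumptions, not the authors' actual platform code.

    # Hypothetical sketch of a gaze-contingent joint attention loop.
    # All names and the dwell threshold are assumptions for illustration.

    # Scene objects as named 2D screen regions: (x, y, width, height).
    SCENE_OBJECTS = {
        "red_cube": (100, 200, 80, 80),
        "blue_ball": (400, 250, 60, 60),
    }
    DWELL_MS = 300  # assumed fixation duration before the agent reacts

    def hit_test(gx, gy):
        """Return the name of the scene object under the gaze point, if any."""
        for name, (x, y, w, h) in SCENE_OBJECTS.items():
            if x <= gx <= x + w and y <= gy <= y + h:
                return name
        return None

    def joint_attention_loop(gaze_samples, orient_agent):
        """Make the virtual human look at whatever the user fixates.

        gaze_samples: iterable of (timestamp_ms, x, y) from an eye tracker.
        orient_agent: callback that turns the agent's head/eyes to a target.
        """
        current, since = None, None
        for t, gx, gy in gaze_samples:
            target = hit_test(gx, gy)
            if target != current:            # gaze moved to a new object
                current, since = target, t
            elif target and t - since >= DWELL_MS:
                orient_agent(target)         # agent follows the user's focus
                since = t                    # avoid re-triggering every frame

    # Example: a simulated gaze stream dwelling on the blue ball.
    samples = [(i * 50, 420, 270) for i in range(10)]
    joint_attention_loop(samples, lambda obj: print("agent looks at", obj))

The dwell threshold matters here: because most everyday gaze behavior is not deliberate, triggering the agent on raw gaze position alone would make it react to incidental saccades rather than to the user's actual focus of attention.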
Keywords :
avatars; gaze tracking; human-computer interaction; communication channel; eye-tracking; gaze-contingent displays; interpersonal skills; joint attention simulation; user engagement; virtual humans; virtual reality; autism; real-time systems; visualization; interaction techniques; evaluation/methodology; handicapped persons/special needs
Journal_Title :
IEEE Transactions on Affective Computing
DOI :
10.1109/TAFFC.2014.2335740