Title :
Automatic gaze analysis in multiparty conversations based on Collective First-Person Vision
Author :
Shiro Kumano;Kazuhiro Otsuka;Ryo Ishii;Junji Yamato
Author_Institution :
NTT Communication Science Laboratories, Japan
Date :
5/1/2015
Abstract :
This paper extends the affective computing research field by introducing first-person vision to automatic conversation analysis. We target medium-sized face-to-face group conversations in which each person wears inward-looking and outward-looking cameras. We demonstrate that the fundamental techniques required for group gaze analysis, i.e., speaker detection, face tracking, and gaze estimation, can be performed accurately and effectively via self-training in a unified framework by gathering the captured audio-visual signals in a centralized system and exploiting a general conversational rule, namely that listeners look mainly at the speaker. We visualize the characteristics of participants' gaze behavior as a gazee-centered heat map, which quantitatively reveals which parts of the gazee's body a participant looked at, and for how long, while speaking or listening. An experiment involving two six-person conversation groups demonstrates the potential of the proposed framework.
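For illustration only, the following minimal sketch shows how a gazee-centered heat map, split by whether the gazer is speaking or listening, could be accumulated from per-frame gaze estimates. It is not the authors' implementation; the array formats, the body-centered coordinate frame, and the function name gaze_heat_maps are assumptions introduced here.

```python
import numpy as np

# Illustrative sketch (assumed interface, not the paper's code).
# gaze_points: (N, 2) gaze locations on the gazee in a body-centered,
#              normalized coordinate frame (hypothetical format).
# gazer_is_speaking: (N,) boolean array from speaker detection.
def gaze_heat_maps(gaze_points, gazer_is_speaking, bins=64, extent=1.5):
    """Return two 2-D histograms ("speaking" / "listening"), each normalized
    so a cell approximates the fraction of gaze time spent on that region."""
    edges = np.linspace(-extent, extent, bins + 1)
    maps = {}
    for label, mask in (("speaking", gazer_is_speaking),
                        ("listening", ~gazer_is_speaking)):
        pts = gaze_points[mask]
        hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=[edges, edges])
        total = hist.sum()
        maps[label] = hist / total if total > 0 else hist
    return maps
```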
Keywords :
"Face","Cameras","Iris","Calibration","Target tracking","Estimation"
Conference_Titel :
2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
DOI :
10.1109/FG.2015.7284861