Title :
Affect detection from body language during social HRI
Author :
McColl, Derek ; Nejat, Goldie
Author_Institution :
Dept. of Mech. & Ind. Eng., Univ. of Toronto, Toronto, ON, Canada
Abstract :
In order for robots to effectively engage a person in bi-directional social human-robot interaction (HRI), they need to be able to perceive and respond appropriately to a person's affective state. It has been shown that body language is essential in effectively communicating human affect. In this paper, we present an automated real-time body language recognition and classification system, utilizing the Microsoft® Kinect™ sensor, that determines a person's affect in terms of their accessibility (i.e., openness and rapport) towards a robot during natural one-on-one interactions. Social HRI experiments are presented with our human-like robot Brian 2.0, and a comparison study between our proposed system and one developed with the Kinect™ body pose estimation algorithm verifies the performance of our affect classification system in HRI scenarios.
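Illustrative note (not part of the original record): the paper itself details the recognition pipeline; purely as a hedged sketch of the general idea, the Python snippet below shows how coarse body-language cues (trunk lean, arm openness, head height) might be derived from skeleton-style joint positions and thresholded into a discrete accessibility level. The joint names, features, and thresholds here are assumptions for illustration, not the authors' method.

import numpy as np

def body_features(skel: dict) -> np.ndarray:
    """Derive coarse body-language cues from 3D joint positions (metres,
    camera frame): trunk lean toward the sensor, arm openness normalised
    by shoulder width, and head height."""
    lean = skel["spine"][2] - skel["head"][2]            # >0: leaning toward sensor
    shoulder_w = np.linalg.norm(skel["shoulder_left"] - skel["shoulder_right"])
    openness = np.linalg.norm(skel["hand_left"] - skel["hand_right"]) / shoulder_w
    return np.array([lean, openness, skel["head"][1]])

def accessibility_level(features: np.ndarray) -> int:
    """Map features to a discrete accessibility level
    (1 = closed/withdrawn, 4 = open/engaged). Thresholds are placeholders."""
    score = 0
    score += features[0] > 0.05    # leaning toward the robot
    score += features[1] > 1.2     # arms held apart / open posture
    score += features[2] > 0.4     # head raised
    return 1 + score

# Example with synthetic joint data (values made up for illustration).
skel = {"head":           np.array([0.0,  0.55, 1.90]),
        "spine":          np.array([0.0,  0.10, 2.00]),
        "shoulder_left":  np.array([-0.2, 0.40, 1.95]),
        "shoulder_right": np.array([0.2,  0.40, 1.95]),
        "hand_left":      np.array([-0.35, 0.0, 1.85]),
        "hand_right":     np.array([0.35,  0.0, 1.85])}
print(accessibility_level(body_features(skel)))   # prints 4 for this open posture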
Keywords :
gesture recognition; human-robot interaction; infrared detectors; pattern classification; pose estimation; psychology; Kinect™ body pose estimation algorithm; Microsoft® Kinect™ sensor; affect classification system; automated real-time body language classification system; automated real-time body language recognition system; bidirectional social human-robot interaction; human-like robot Brian 2.0; social HRI experiments; Ellipsoids; Estimation; Head; Humans; Robot sensing systems; Skin;
Conference_Title :
2012 IEEE RO-MAN (International Symposium on Robot and Human Interactive Communication)
Conference_Location :
Paris, France
Print_ISBN :
978-1-4673-4604-7
Electronic_ISSN :
1944-9445
DOI :
10.1109/ROMAN.2012.6343882