Title :
Multi-human spatial social pattern understanding for a multi-modal robot through nonverbal social signals
Author :
Shih-Huan Tseng ; Yuan-Han Hsu ; Yi-Shiu Chiang ; Tung-Yen Wu ; Li-Chen Fu
Author_Institution :
Dept. of Electrical Engineering, National Taiwan University (NTU), Taipei, Taiwan
Abstract :
For service robots to operate in a multi-human office environment, it is important to recognize the social patterns of a group of human users and then provide them with appropriate services in a timely manner. Human users' social patterns are usually expressed through nonverbal social signals. In this paper, a new integrated approach for recognizing multi-human social signals is proposed. Specifically, the nonverbal social signals are detected by a laser range finder and an RGB-D camera and are processed to identify the multi-human (spatial) social patterns. The recognized patterns are then used to establish human-to-human, human-to-robot, or multi-human-to-robot interaction formations. Experimental results show that our robot successfully recognizes the aforementioned users' social patterns and follows up with appropriate services.
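Illustrative sketch (an assumption, not the authors' actual algorithm, which the abstract does not detail): one common way to recognize a spatial social pattern such as a conversational F-formation is to fuse person positions from laser-based leg tracking with body orientations from an RGB-D skeleton tracker and test whether the group's facing directions converge on a shared o-space centre. All names (`Person`, `is_f_formation`) and thresholds below are hypothetical.

import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Person:
    x: float       # position from laser-based leg tracking (metres)
    y: float
    theta: float   # body orientation from RGB-D skeleton (radians)


def o_space_centre(people: List[Person], stride: float = 0.8) -> Tuple[float, float]:
    """Project each person forward by `stride` metres and average the projections."""
    cx = sum(p.x + stride * math.cos(p.theta) for p in people) / len(people)
    cy = sum(p.y + stride * math.sin(p.theta) for p in people) / len(people)
    return cx, cy


def is_f_formation(people: List[Person],
                   stride: float = 0.8,
                   max_spread: float = 0.5) -> Optional[Tuple[float, float]]:
    """
    Return the shared o-space centre if every person's projected focus point
    lies within `max_spread` metres of the mean (i.e. the group plausibly
    faces a common interaction space); otherwise return None.
    """
    if len(people) < 2:
        return None
    cx, cy = o_space_centre(people, stride)
    for p in people:
        fx = p.x + stride * math.cos(p.theta)
        fy = p.y + stride * math.sin(p.theta)
        if math.hypot(fx - cx, fy - cy) > max_spread:
            return None
    return cx, cy


if __name__ == "__main__":
    # Two people 1.6 m apart, facing each other (a vis-a-vis formation).
    group = [Person(0.0, 0.0, 0.0), Person(1.6, 0.0, math.pi)]
    print(is_f_formation(group))   # -> approximately (0.8, 0.0)

Given such a detected formation, a robot could, for example, approach the group from outside the o-space to offer a service without intruding on the interaction.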
Keywords :
human-robot interaction; image colour analysis; image sensors; laser ranging; robot vision; service robots; RGB-D camera; human-to-human interactive formation; human-to-robot interactive formation; laser range finder; multihuman office environment; multihuman social signals; multihuman spatial social pattern understanding; multihuman-to-robot interactive formation; multimodal robot; nonverbal social signals; service robots; Face; Fuses; Lasers; Robot sensing systems; Skeleton;
Conference_Title :
The 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2014)
Conference_Location :
Edinburgh
Print_ISBN :
978-1-4799-6763-6
DOI :
10.1109/ROMAN.2014.6926307