DocumentCode
3673369
Title
Visual-auditory saliency detection using event-driven visual sensors
Author
Himanshu Akolkar;David Reverter Valeiras;Ryad Benosman;Chiara Bartolozzi
Author_Institution
iCub Facility, Istituto Italiano di Tecnologia, 16163 Genova, Italy
fYear
2015
fDate
6/1/2015
Firstpage
1
Lastpage
6
Abstract
This paper presents a novel architecture for audio-visual saliency detection using event-based visual sensors and traditional microphones installed on the head of a humanoid robot. In the context of collision detection, salient sensory events must be detected at the same time in the visual and the auditory domain. Real collisions in the visual space can be distinguished from fake ones (e.g. due to the movement of two objects that occlude each other) because they generate a sound at the moment of impact. This temporal coincidence is extremely difficult to detect with frame-based sensors, which intrinsically add a fixed delay to sensory acquisition or can miss the collision altogether. The high temporal resolution of event-driven vision sensors, together with a real-time clustering and tracking algorithm, allows for the detection of potential collisions with very low latency. Auditory events corresponding to collisions are detected using simple spectral analysis of the auditory signals. Visual events can therefore be temporally integrated with coherently occurring auditory events to detect fast transients and to disentangle real collisions from visual or auditory events that do not correspond to one. The proposed audio-visual collision detection is used in the context of human-robot interaction, to detect people clapping in front of the robot and to orient its gaze toward the perceived collision.
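(Illustrative note: the pipeline sketched in the abstract, detecting auditory transients by spectral energy and confirming visual collision candidates through temporal coincidence, can be summarized in a minimal Python sketch. The frame length, energy-threshold heuristic, and 10 ms coincidence window below are assumptions chosen for illustration, not parameters reported in the paper.)

import numpy as np

def audio_onsets(signal, fs, frame_len=256, hop=128, threshold=5.0):
    # Short-time spectral energy: a broadband transient (e.g. a clap)
    # appears as a frame whose energy far exceeds the median frame energy.
    # `threshold` (a multiple of the median) is a hypothetical tuning
    # parameter, not a value taken from the paper.
    n_frames = max(0, 1 + (len(signal) - frame_len) // hop)
    window = np.hanning(frame_len)
    energies = np.empty(n_frames)
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_len]
        spectrum = np.abs(np.fft.rfft(frame * window))
        energies[i] = np.sum(spectrum ** 2)
    ref = np.median(energies) + 1e-12
    onset_frames = np.flatnonzero(energies > threshold * ref)
    return onset_frames * hop / fs  # onset times in seconds

def coincident_collisions(visual_ts, audio_ts, window_s=0.010):
    # Keep only visual collision candidates (timestamps in seconds, as
    # produced by an event-driven tracker) that have an auditory onset
    # within +/- window_s; isolated visual or auditory events are dropped.
    audio_ts = np.sort(np.asarray(audio_ts))
    confirmed = []
    for t in np.asarray(visual_ts):
        j = np.searchsorted(audio_ts, t)
        nearest = [audio_ts[k] for k in (j - 1, j) if 0 <= k < len(audio_ts)]
        if nearest and min(abs(a - t) for a in nearest) <= window_s:
            confirmed.append(float(t))
    return confirmed

# Example: a synthetic 1 s recording with a clap-like burst at ~0.5 s.
fs = 16000
sig = 0.01 * np.random.randn(fs)
sig[fs // 2 : fs // 2 + 200] += np.random.randn(200)  # transient burst
onsets = audio_onsets(sig, fs)
print(coincident_collisions([0.2, 0.5], onsets))  # only 0.5 is confirmed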
Keywords
"Visualization","Robot sensing systems","Collision avoidance","Microphones","Noise"
Publisher
ieee
Conference_Titel
2015 International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP)
Type
conf
DOI
10.1109/EBCCSP.2015.7300674
Filename
7300674