DocumentCode :
1737886
Title :
Finding correspondence between visual and auditory events based on perceptual grouping laws across different modalities
Author :
Chen, Jinji ; Mukai, Toshiharu ; Takeuchi, Yoshinori ; Kudo, Hiroaki ; Yamamura, Tsuyoshi ; Ohnishi, Noboru
Author_Institution :
Dept. of Inf. Eng., Nagoya Univ., Japan
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
242
Abstract :
A human being understands the environment by integrating information obtained through sight, hearing and touch. To integrate information across different senses, a human being must find the correspondence between events observed by the different senses. This paper seeks to relate audio-visual events caused by more than one movement according to general physical laws, without object-specific knowledge. As correspondence cues, we use Gestalt grouping laws: simultaneity of the occurrence of a sound and a change in movement, similarity of the time variation between sound and movement, etc. We conducted experiments in a real environment and obtained satisfactory results showing the effectiveness of the proposed method.
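The abstract names two grouping cues: simultaneity (a sound onset coinciding with a change in movement) and similarity (the sound's time variation resembling the motion's). Below is a minimal Python sketch of how such cues could be combined to pick the visual motion corresponding to a heard sound; it is not the authors' implementation, and the signal names, Gaussian onset penalty, and equal cue weights are all assumptions for illustration.

```python
import numpy as np

def onset_time(signal, fs, threshold_ratio=0.2):
    """Time (s) at which |signal| first exceeds a fraction of its peak."""
    idx = np.argmax(np.abs(signal) > threshold_ratio * np.abs(signal).max())
    return idx / fs

def correspondence_score(audio_env, motion_speed, fs, sigma=0.05):
    """Combine an onset-simultaneity term and a shape-similarity term
    (illustrative scoring, not the paper's formulation)."""
    # Cue 1: simultaneity -- Gaussian penalty on the onset-time difference.
    dt = onset_time(audio_env, fs) - onset_time(motion_speed, fs)
    simultaneity = np.exp(-dt**2 / (2 * sigma**2))
    # Cue 2: similarity -- normalized correlation of the two time profiles.
    a = (audio_env - audio_env.mean()) / (audio_env.std() + 1e-9)
    m = (motion_speed - motion_speed.mean()) / (motion_speed.std() + 1e-9)
    similarity = float(np.clip(np.dot(a, m) / len(a), 0.0, 1.0))
    return 0.5 * simultaneity + 0.5 * similarity  # equal weights, assumed

# Usage: choose, among candidate visual motions, the one matching the sound.
fs = 100  # Hz, assumed common sampling rate after resampling both streams
t = np.arange(0, 2, 1 / fs)
audio_env = np.exp(-((t - 0.5) ** 2) / 0.01)            # sound burst at 0.5 s
motions = {
    "bouncing ball": np.exp(-((t - 0.5) ** 2) / 0.01),  # moves when sound occurs
    "waving hand":   np.exp(-((t - 1.5) ** 2) / 0.01),  # unrelated movement
}
best = max(motions, key=lambda k: correspondence_score(audio_env, motions[k], fs))
print("corresponding visual event:", best)
```

Because both cues are object-agnostic functions of the time signals alone, the score needs no object-specific knowledge, in keeping with the paper's stated goal.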
Keywords :
sensor fusion; audio-visual event correspondence; auditory events; grouping law; perceptual grouping laws; visual events; Acoustic noise; Acoustic sensors; Auditory system; Control systems; Face recognition; Humans; Image recognition; Legged locomotion; Speech recognition; Working environment noise;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2000 IEEE International Conference on Systems, Man, and Cybernetics
Conference_Location :
Nashville, TN
ISSN :
1062-922X
Print_ISBN :
0-7803-6583-6
Type :
conf
DOI :
10.1109/ICSMC.2000.884996
Filename :
884996