DocumentCode
2647273
Title
Grouping corresponding parts in vision and audition using perceptual grouping among different sensations
Author
Mukai, Toshiharu ; Ohnishi, Noboru
Author_Institution
RIKEN, Inst. of Phys. & Chem. Res., Nagoya, Japan
fYear
1996
fDate
8-11 Dec 1996
Firstpage
713
Lastpage
718
Abstract
In sensor fusion, output from various sensors is fused to obtain better information. Such systems implicitly assume that the output from the various sensors originates from the same object, and in many sensor fusion systems the designers deliberately arrange the sensors so that this holds. However, if sensors are mounted on a system without such intention, the assumption is not guaranteed. In this paper, in order to automatically find relationships among sensors, we propose laws for perceptual organization among different sensations, referring to the Gestalt laws of organization within a single sensation. The validity of these laws is then examined in simple experiments. The experimental system consists of vision and audition, and it extracts corresponding parts of the two sensations by referring to changes in the output of each sensation.
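A minimal sketch of the kind of cross-modal grouping the abstract describes: visual regions are paired with the auditory signal when their temporal changes co-occur. The region features, windowing, and correlation threshold below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def change_signal(samples):
    """Magnitude of frame-to-frame change in a 1-D sensor output."""
    return np.abs(np.diff(samples))

def group_by_common_change(visual_regions, audio_energy, threshold=0.5):
    """Pair visual regions with audition when their temporal changes correlate.

    visual_regions: dict name -> 1-D array of per-frame feature values
                    (e.g. mean brightness or motion magnitude of a region).
    audio_energy:   1-D array of per-frame audio energy, same length.
    Returns the names of regions whose change pattern matches the sound.
    """
    audio_change = change_signal(audio_energy)
    grouped = []
    for name, values in visual_regions.items():
        visual_change = change_signal(values)
        # Normalized correlation between the two change signals.
        corr = np.corrcoef(visual_change, audio_change)[0, 1]
        if corr > threshold:
            grouped.append(name)
    return grouped

# Illustrative data: region "A" changes together with the sound, "B" does not.
t = np.arange(100)
audio = (np.sin(0.3 * t) > 0).astype(float)          # sound on/off pattern
region_a = audio + 0.05 * np.random.randn(100)       # moves when sound occurs
region_b = 0.05 * np.random.randn(100)               # unrelated to the sound
print(group_by_common_change({"A": region_a, "B": region_b}, audio))
```

Under these assumptions, region "A" is grouped with the auditory stream because its changes coincide with changes in the sound, which mirrors the paper's idea of applying Gestalt-style "common fate" across sensations rather than within one.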
Keywords
acoustic signal processing; motion estimation; psychology; sensor fusion; Gestalt psychology; audition; parts grouping; perceptual grouping; perceptual organization; sensor fusion; vision; Biosensors; Chemical and biological sensors; Chemical sensors; Humans; Machine vision; Pediatrics; Psychology; Robot sensing systems; Sensor fusion; Sensor systems;
fLanguage
English
Publisher
ieee
Conference_Titel
1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems
Conference_Location
Washington, DC
Print_ISBN
0-7803-3700-X
Type
conf
DOI
10.1109/MFI.1996.572307
Filename
572307