DocumentCode
2116005
Title
Epipolar geometry based sound localization and extraction for humanoid audition
Author
Nakadai, Kazuhiro ; Okuno, Hiroshi G. ; Kitano, Hiroaki
Author_Institution
ERATO, Japan Sci. & Technol. Corp., Tokyo, Japan
Volume
3
fYear
2001
fDate
2001
Firstpage
1395
Abstract
Sound localization for a robot or an embedded system is usually solved by using the inter-aural phase difference (IPD) and inter-aural intensity difference (IID). These values are calculated by using a head-related transfer function (HRTF). However, the HRTF depends on the shape of the head and also changes with the environment. Therefore, sound localization without an HRTF is needed for real-world applications. In this paper, we present a new sound localization method based on auditory epipolar geometry with motion control. Auditory epipolar geometry is an extension of epipolar geometry in stereo vision to audition, and the auditory and visual epipolar geometries can share the sound source direction. The key idea is to exploit additional inputs obtained by motor control in order to compensate for the degradation of the IPD and IID caused by room reverberation and the robot's body. The proposed system can simultaneously localize and extract two sound sources in a real-world room.
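The abstract's central relation, that the IPD of a frequency component constrains the source direction relative to the microphone baseline without any HRTF, can be illustrated with a minimal far-field two-microphone sketch. This is not the authors' implementation of auditory epipolar geometry; the baseline length, speed of sound, and the function name doa_from_ipd are assumptions made for illustration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature (assumption)
BASELINE = 0.18          # m between the two microphones (assumption)

def doa_from_ipd(ipd_rad, freq_hz, baseline=BASELINE, c=SPEED_OF_SOUND):
    """Estimate source direction (degrees from the interaural axis) from the
    inter-aural phase difference at one frequency, using a far-field
    two-microphone model instead of an HRTF."""
    # Far-field model: IPD = 2*pi*f*(baseline/c)*cos(theta); solve for theta.
    cos_theta = ipd_rad * c / (2.0 * np.pi * freq_hz * baseline)
    cos_theta = np.clip(cos_theta, -1.0, 1.0)  # guard against noisy IPD values
    return np.degrees(np.arccos(cos_theta))

# Example: a 1 kHz component arriving with an IPD of 0.8 rad
print(doa_from_ipd(0.8, 1000.0))
```

In this simplified view, each measured IPD defines a cone of candidate directions around the microphone baseline, which is the auditory analogue of an epipolar constraint in stereo vision; the paper additionally uses motor control and visual epipolar geometry to resolve and refine the direction.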
Keywords
audio signal processing; computational geometry; mobile robots; motion control; position control; sensor fusion; active audition; auditory epipolar geometry; humanoid robot; motion control; sensor fusion; sound localization; Data mining; Humanoid robots; Information geometry; Intelligent robots; Microphones; Motor drives; Robot sensing systems; Shape; Stereo vision; Symbiosis;
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2001)
Conference_Location
Maui, HI
Print_ISBN
0-7803-6612-3
Type
conf
DOI
10.1109/IROS.2001.977176
Filename
977176