DocumentCode :
1207103
Title :
Audio–Visual Active Speaker Tracking in Cluttered Indoors Environments
Author :
Talantzis, Fotios; Pnevmatikakis, Aristodemos; Constantinides, Anthony G.
Author_Institution :
Autonomic & Grid Computing Group, Athens Information Technology, Athens
Volume :
38
Issue :
3
fYear :
2008
fDate :
1 June 2008
Firstpage :
799
Lastpage :
807
Abstract :
We propose a system for detecting the active speaker in cluttered and reverberant environments where more than one person speaks and moves. Rather than using only audio information, the system utilizes audiovisual information from multiple acoustic and video sensors that feed separate audio and video tracking modules. The audio module operates using a particle filter (PF) and an information-theoretic framework to provide accurate acoustic source location under reverberant conditions. The video subsystem combines in 3-D a number of 2-D trackers based on a variation of Stauffer's adaptive background algorithm with spatiotemporal adaptation of the learning parameters, together with a Kalman tracker in a feedback configuration. Extensive experiments show that fusing the separate modalities yields gains in detecting the active speaker.
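To make the particle-filtering idea in the abstract concrete, the sketch below is a generic sequential importance resampling (SIR) filter for a 3-D source position. It is an illustrative assumption, not the paper's method: the random-walk motion model, the Gaussian likelihood of a hypothetical position measurement, and all numeric parameters are placeholders, whereas the paper uses an information-theoretic acoustic likelihood built from the microphone signals.

```python
# Minimal SIR particle-filter sketch for tracking a 3-D source position.
# Assumptions (not from the paper): random-walk motion, Gaussian likelihood
# of a noisy position "measurement", and the room/noise parameters below.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, motion_std=0.05):
    # Random-walk motion model: perturb each 3-D particle position.
    return particles + rng.normal(scale=motion_std, size=particles.shape)

def update(particles, weights, measurement, meas_std=0.2):
    # Re-weight particles by a Gaussian likelihood of the measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std**2)
    weights += 1e-300                       # guard against numerical collapse
    return weights / weights.sum()

def resample(particles, weights):
    # Systematic-style resampling when the effective sample size drops.
    n = len(weights)
    if 1.0 / np.sum(weights**2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Toy run: a source drifting along x inside a 5 m x 5 m x 3 m room.
n_particles = 500
particles = rng.uniform([0, 0, 0], [5, 5, 3], size=(n_particles, 3))
weights = np.full(n_particles, 1.0 / n_particles)
for t in range(50):
    true_pos = np.array([1.0 + 0.05 * t, 2.5, 1.5])
    z = true_pos + rng.normal(scale=0.2, size=3)   # noisy position observation
    particles = predict(particles)
    weights = update(particles, weights, z)
    particles, weights = resample(particles, weights)

estimate = np.average(particles, axis=0, weights=weights)
print("estimated source position:", estimate)
```

In the full system such a per-frame state estimate from the audio tracker would be fused with the 3-D output of the video trackers to decide which person is the active speaker.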
Keywords :
speaker recognition; tracking; audio module; audio-visual active speaker tracking; cluttered indoors environments; particle filter; video tracking modules; Information theory; particle filters (PFs); person tracking; Algorithms; Artificial Intelligence; Biometry; Environment; Image Interpretation, Computer-Assisted; Sound Spectrography; Speech Recognition Software;
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
Publisher :
IEEE
ISSN :
1083-4419
Type :
jour
DOI :
10.1109/TSMCB.2008.922063
Filename :
4505428