Title :
Multimodal signal fusion
Author :
Sun, Ming-Ting
Author_Institution :
Dept. of Electr. Eng., Univ. of Washington, Seattle, WA, USA
Date :
June 28, 2009 - July 3, 2009
Abstract :
Our daily life involves multimodal signals (e.g., visual, audio, text, and signals from various sensors). Multimodal signals are highly correlated; for example, researchers have used audio activities for video summarization of ball games and for violence detection in movies. Multimodal signals are also complementary to each other; in multimodal surveillance, for instance, audio may carry important information that is not available in video. People usually don't hire a blind or deaf person for surveillance duties, since a human naturally fuses multimodal signals for the best results.
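The abstract motivates fusion but does not spell out a method. A common baseline for combining complementary modalities is score-level (late) fusion, where per-modality detector scores are merged into a single decision. The sketch below is a minimal illustration only; the function name fuse_scores, the weights, the threshold, and the example scores are assumptions for illustration and are not taken from the tutorial.

# Minimal sketch of score-level (late) fusion for a surveillance-style
# event detector. All names, weights, and scores are illustrative
# assumptions, not the method presented in the tutorial.

AUDIO_WEIGHT = 0.4   # assumed relative trust in the audio detector
VIDEO_WEIGHT = 0.6   # assumed relative trust in the video detector
THRESHOLD = 0.5      # assumed decision threshold on the fused score


def fuse_scores(audio_score: float, video_score: float) -> float:
    """Weighted late fusion of per-modality detection scores in [0, 1]."""
    return AUDIO_WEIGHT * audio_score + VIDEO_WEIGHT * video_score


if __name__ == "__main__":
    # Example: the video detector sees little (e.g., occlusion), but the
    # audio detector picks up a loud event; fusing both still flags it.
    audio_score, video_score = 0.9, 0.3
    fused = fuse_scores(audio_score, video_score)
    print(f"fused score = {fused:.2f}, alarm = {fused >= THRESHOLD}")

In this toy example the fused score is 0.4 * 0.9 + 0.6 * 0.3 = 0.54, so the event is flagged even though the video score alone falls below the threshold, which illustrates the complementarity argument made in the abstract.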
Keywords :
hidden Markov models; learning (artificial intelligence); signal processing; activity recognition system; asynchronous HMM; audio activities; audio sensor; automatic human activity detection; automatic learning; ball games; coupled hidden Markov model; cross-modality correlation; heart-rate sensors; machine learning; motion sensors; multimedia applications; multimodal representation approach; multimodal sensors; multimodal signal fusion; multimodal signal processing; multimodal surveillance; semantic concepts; sleep monitoring; supervised learning; video summarization; violence detection; visual concepts; Circuits and systems; Hidden Markov models; Humans; Labeling; Multimodal sensors; Sensor systems; Supervised learning; Surveillance; Training data;
Conference_Titel :
2009 IEEE International Conference on Multimedia and Expo (ICME 2009)
Conference_Location :
New York, NY
Print_ISBN :
978-1-4244-4290-4
Electronic_ISSN :
1945-7871
DOI :
10.1109/ICME.2009.5202805