Title :
Leveraging Kullback–Leibler Divergence Measures and Information-Rich Cues for Speech Summarization
Author :
Lin, Shih-Hsiang ; Yeh, Yaoming ; Chen, Berlin
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Normal Univ., Taipei, Taiwan
Date :
1 May 2011
Abstract :
Imperfect speech recognition often leads to degraded performance when exploiting conventional text-based methods for speech summarization. To alleviate this problem, this paper investigates various ways to robustly represent the recognition hypotheses of spoken documents beyond the top scoring ones. Moreover, a summarization framework, building on the Kullback-Leibler (KL) divergence measure and exploring both the relevance and topical information cues of spoken documents and sentences, is presented to work with such robust representations. Experiments on broadcast news speech summarization tasks appear to demonstrate the utility of the presented approaches.
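The abstract describes ranking spoken-document sentences with a Kullback-Leibler divergence measure. As a rough illustration only (not the authors' actual system, which also incorporates relevance and topical cues over recognition hypotheses), the following sketch scores each sentence by KL(document || sentence) over smoothed unigram distributions; the function names and the smoothing scheme are assumptions for this example.

```python
import math
from collections import Counter

def kl_divergence(p, q, vocab):
    """KL(P || Q) over a shared vocabulary; q must be nonzero on vocab."""
    return sum(p[w] * math.log(p[w] / q[w]) for w in vocab if p[w] > 0)

def rank_sentences(sentences, smoothing=0.1):
    """Rank sentences by KL divergence between the document's unigram
    distribution and each sentence's smoothed unigram distribution.
    Lower divergence = sentence better matches the document as a whole.
    (Illustrative sketch; the paper's framework is richer than this.)"""
    doc_tokens = [w for s in sentences for w in s.split()]
    vocab = set(doc_tokens)
    doc_counts = Counter(doc_tokens)
    doc_dist = {w: doc_counts[w] / len(doc_tokens) for w in vocab}

    scores = []
    for idx, sent in enumerate(sentences):
        counts = Counter(sent.split())
        # Additive smoothing keeps the sentence model nonzero everywhere
        total = sum(counts.values()) + smoothing * len(vocab)
        sent_dist = {w: (counts[w] + smoothing) / total for w in vocab}
        scores.append((kl_divergence(doc_dist, sent_dist, vocab), idx))
    return sorted(scores)  # most representative sentences first

sents = ["the cat sat on the mat",
         "the dog barked",
         "the cat chased the dog on the mat"]
ranking = rank_sentences(sents)
print(ranking[0][1])  # index of the lowest-divergence sentence → 2
```

A summary is then formed by taking the top-ranked sentences up to a length budget; the paper's contribution lies in making the distributions robust to recognition errors by using multiple recognition hypotheses rather than only the top-scoring transcript.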
Keywords :
speech recognition; speech summarization; conventional text-based methods; information-rich cues; Kullback–Leibler (KL) divergence; spoken document recognition hypotheses; multiple recognition hypotheses; relevance information; topical information;
Journal_Title :
IEEE Transactions on Audio, Speech, and Language Processing
DOI :
10.1109/TASL.2010.2066268