• DocumentCode
    2020287
  • Title
    An unsupervised approach for story-related subject caption detection in broadcast news video
  • Author
    Zeng, Zhi ; Li, Heping ; Liang, Wei ; Zhang, Shuwu

  • Author_Institution
    Inst. of Autom., Chinese Acad. of Sci., Beijing, China
  • fYear
    2010
  • fDate
    23-25 Nov. 2010
  • Firstpage
    158
  • Lastpage
    162
  • Abstract
    The story-related subject caption (SSC) in a broadcast news video expresses the subject of a news story and plays an important role in news story segmentation and news video indexing. We find that an SSC always has a strip background and that all the SSCs in one news video share the same style. By taking advantage of these characteristics, this paper presents an unsupervised approach to detecting SSCs in broadcast news video. First, we filter out most of the frames without SSCs by detecting horizontal lines. Second, a classic text detection technique is applied to detect captions in frames containing horizontal lines; at the same time, spatiotemporal slice processing is employed to track the detected captions and avoid rescanning. Third, all captions detected in the above steps are treated as candidates and grouped by spectral clustering. Finally, according to the caption clusters' sizes and spanning times, one cluster of captions is selected as the SSCs. Experimental results show that the proposed approach detects SSCs in broadcast news video accurately.
  • Keywords
    image segmentation; text analysis; video signal processing; broadcast news video; horizontal lines detection; news story segmentation; news video indexing; spatiotemporal slices; spectral clustering; story-related subject caption detection; text detection technique; Clustering algorithms; Histograms; Image color analysis; Image edge detection; Strips; TV; Visualization;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Audio Language and Image Processing (ICALIP), 2010 International Conference on
  • Conference_Location
    Shanghai
  • Print_ISBN
    978-1-4244-5856-1
  • Type
    conf
  • DOI
    10.1109/ICALIP.2010.5684988
  • Filename
    5684988
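
The selection stage described in the abstract (cluster the candidate captions, then pick one cluster as the SSCs by its size and spanning time) can be sketched as follows. This is a minimal illustration only: the `Caption` fields, the tolerance value, and the simple distance-based grouping used here as a stand-in for the paper's spectral clustering are all assumptions, not the authors' implementation.

```python
# Illustrative sketch of SSC cluster selection, assuming captions are
# described by a normalized strip position/height and a frame index.
# The distance-based grouping below is a simplified stand-in for the
# spectral clustering used in the paper.
from dataclasses import dataclass


@dataclass
class Caption:
    y: float        # vertical position of the caption strip (normalized)
    height: float   # strip height (normalized)
    frame: int      # frame index where the caption was detected


def cluster_captions(captions, tol=0.05):
    """Group captions whose position and height differ by less than tol."""
    clusters = []
    for c in captions:
        for cl in clusters:
            ref = cl[0]
            if abs(c.y - ref.y) < tol and abs(c.height - ref.height) < tol:
                cl.append(c)
                break
        else:
            clusters.append([c])
    return clusters


def select_ssc_cluster(clusters):
    """Pick the largest cluster; break ties by its spanning time in frames."""
    def score(cl):
        span = max(c.frame for c in cl) - min(c.frame for c in cl)
        return (len(cl), span)
    return max(clusters, key=score)


captions = [
    Caption(0.80, 0.08, 10),
    Caption(0.80, 0.08, 250),
    Caption(0.81, 0.08, 600),   # same style as above: three story SSCs
    Caption(0.10, 0.05, 40),    # unrelated overlay caption
]
ssc = select_ssc_cluster(cluster_captions(captions))
print(len(ssc))  # → 3: the same-style captions form the SSC cluster
```

Because all SSCs in one video share a style while other overlays vary, the same-style captions dominate both cluster size and spanning time, which is the intuition behind the paper's final selection criterion.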