Title :
Utilizing Related Samples to Enhance Interactive Concept-Based Video Search
Author :
Yuan, Jin ; Zha, Zheng-Jun ; Zheng, Yan-Tao ; Wang, Meng ; Zhou, Xiangdong ; Chua, Tat-Seng
Author_Institution :
Sch. of Comput., Nat. Univ. of Singapore, Singapore, Singapore
Abstract :
One of the main challenges in interactive concept-based video search is the problem of insufficient relevant samples, especially for queries with complex semantics. In this paper, “related samples” are exploited to enhance interactive video search. Related samples refer to video segments that are relevant to part of the query rather than to the entire query. Compared to the relevant samples, which may be rare, the related samples are usually plentiful and easy to find in search results. Generally, the related samples are visually similar and temporally neighboring to the relevant samples. Based on these two characteristics, we develop a visual ranking model that simultaneously exploits the relevant, related, and irrelevant samples, as well as a temporal ranking model that leverages the temporal relationship between related and relevant samples. An adaptive fusion method is then proposed to optimally combine these two ranking models to generate the search results. We conduct extensive experiments on two real-world video datasets: TRECVID 2008 and YouTube. The experimental results show that our approach achieves at least 96% and 167% performance improvements over the state-of-the-art approaches on the TRECVID 2008 and YouTube datasets, respectively.
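As a rough illustration of the fusion step described in the abstract (not the paper's actual formulation), the sketch below combines per-shot scores from a visual ranking model and a temporal ranking model using a query-dependent weight. The function name, the min-max normalization, and the scalar weight `alpha` are assumptions made for this example.

```python
import numpy as np

def fuse_rankings(visual_scores, temporal_scores, alpha=0.5):
    """Fuse visual and temporal ranking scores for a list of video shots.

    `alpha` is assumed to be chosen adaptively per query (e.g., based on the
    relative reliability of the two models); here it is a plain parameter.
    Returns shot indices sorted by fused score, highest first.
    """
    v = np.asarray(visual_scores, dtype=float)
    t = np.asarray(temporal_scores, dtype=float)

    def normalize(s):
        # Min-max normalize so the two score ranges are comparable.
        rng = s.max() - s.min()
        return (s - s.min()) / rng if rng > 0 else np.zeros_like(s)

    fused = alpha * normalize(v) + (1.0 - alpha) * normalize(t)
    return np.argsort(-fused)
```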
Keywords :
interactive systems; video retrieval; TRECVID 2008; YouTube datasets; adaptive fusion method; interactive concept-based video search; query; related sample utilization; temporal ranking model; video segments; Adaptation models; Interactive systems; Search methods; Semantics; Streaming media; YouTube; Complex query; concept-based video search; interactive search; related sample
Journal_Title :
IEEE Transactions on Multimedia
DOI :
10.1109/TMM.2011.2168813