Title :
Automatic extraction of useful scenario information for dramatic videos
Author_Institution :
Adv. Res. Inst., Inst. for Inf. Ind., Taipei, Taiwan
Abstract :
Aiming to present a new video viewing mode, we propose a scheme that automatically extracts useful scenario information from dramatic videos. With our method, viewers can easily retrieve the video clips they are interested in. The proposed method comprises three procedures: face processing, interaction score computation, and scenario information extraction. A challenge for face detection in dramatic videos is that characters often do not face the camera frontally. We therefore employ Lin and Liu's approach, which handles head turning and head rotation in face detection. Subsequently, face recognition is combined with scene information to speed up the recognition procedure. From the recognition results we identify the main characters, i.e., the leading actor and leading actress. A quantitative measure, the interaction score, is computed to represent the degree of interaction between two or more characters. Viewers then select particular characters, and our method collects the frames containing those characters into a video clip. Moreover, the interaction score is exploited to construct an interaction graph and a phenogram of the characters.
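The abstract does not reproduce the interaction-score formula, but the idea can be illustrated with a minimal Python sketch. It assumes the score is derived from how often two characters' recognized faces co-occur in frames; the function names, the normalization by total frame count, and the clip-merging rule below are illustrative assumptions, not the authors' actual definitions.

    from collections import defaultdict
    from itertools import combinations

    def interaction_scores(frames):
        """Pairwise interaction scores for character pairs.

        `frames` is a list of sets, each holding the character labels
        recognized in one frame. The score here is the fraction of frames
        in which both characters appear together -- a plausible
        co-occurrence measure, not necessarily the paper's exact formula.
        """
        counts = defaultdict(int)
        for faces in frames:
            for a, b in combinations(sorted(faces), 2):
                counts[(a, b)] += 1
        total = len(frames)
        return {pair: c / total for pair, c in counts.items()}

    def clips_with(frames, selected):
        """Indices of frames containing all selected characters,
        with consecutive indices merged into (start, end) clip ranges."""
        hits = [i for i, faces in enumerate(frames) if selected <= faces]
        clips = []
        for i in hits:
            if clips and i == clips[-1][1] + 1:
                clips[-1] = (clips[-1][0], i)
            else:
                clips.append((i, i))
        return clips

    # Toy per-frame recognition results for three characters.
    frames = [{"actor", "actress"}, {"actor"},
              {"actor", "actress", "villain"}, {"actress", "villain"}]
    print(interaction_scores(frames))               # ('actor', 'actress') -> 0.5, ...
    print(clips_with(frames, {"actor", "actress"})) # [(0, 0), (2, 2)]

The resulting pairwise scores could then serve as edge weights in the interaction graph, with higher-scoring pairs drawn closer together or connected by thicker edges.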
Keywords :
feature extraction; graph theory; image sensors; video retrieval; video signal processing; automatic extraction; dramatic videos; face detection; face processing; face recognition; face video camera; interaction graph; interaction score computing; phenogram; scenario information extraction; useful scenario information; video clip retrieval; video viewing mode; Data mining; Face; Face detection; Face recognition; Films; Videos; Visualization; characters interaction; face detection; face recognition; viewing mode; visual graph;
Conference_Title :
2013 9th International Conference on Information, Communications and Signal Processing (ICICS)
Conference_Location :
Tainan
Print_ISBN :
978-1-4799-0433-4
DOI :
10.1109/ICICS.2013.6782967