DocumentCode :
3748883
Title :
FaceDirector: Continuous Control of Facial Performance in Video
Author :
Charles Malleson;Jean-Charles Bazin;Oliver Wang;Derek Bradley;Thabo Beeler;Adrian Hilton;Alexander Sorkine-Hornung
Author_Institution :
Centre for Vision, Univ. of Surrey, Guildford, UK
fYear :
2015
Firstpage :
3979
Lastpage :
3987
Abstract :
We present a method to continuously blend between multiple facial performances of an actor, which can contain different facial expressions or emotional states. As an example, given sad and angry video takes of a scene, our method empowers the movie director to specify arbitrary weighted combinations and smooth transitions between the two takes in post-production. Our contributions include (1) a robust nonlinear audio-visual synchronization technique that exploits complementary properties of audio and visual cues to automatically determine dense spatiotemporal correspondences between takes, and (2) a seamless facial blending approach that gives the director full control to interpolate timing, facial expression, and local appearance, in order to generate novel performances after filming. In contrast to most previous works, our approach operates entirely in image space, avoiding the need for 3D facial reconstruction. We demonstrate that our method can synthesize visually believable performances with applications in emotion transition, performance correction, and timing control.
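The core of the synchronization step in the abstract is finding a monotonic temporal alignment between two takes from per-frame cues. As a minimal illustrative sketch (not the paper's actual method, which combines audio and visual cues nonlinearly), classic dynamic time warping over 1-D per-frame feature sequences shows the idea; the function name `dtw_align` and the use of plain absolute-difference costs are assumptions for illustration:

```python
import numpy as np

def dtw_align(a, b):
    """Dynamic time warping between two 1-D per-frame feature sequences.

    Returns the minimum-cost monotonic alignment path as (i, j) index
    pairs mapping frames of take `a` to frames of take `b`. This is a
    simplified stand-in for the paper's audio-visual synchronization.
    """
    n, m = len(a), len(b)
    cost = np.abs(a[:, None] - b[None, :])          # pairwise frame distances
    acc = np.full((n + 1, m + 1), np.inf)           # accumulated cost table
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j - 1],  # match
                acc[i - 1, j],      # skip a frame of a
                acc[i, j - 1],      # skip a frame of b
            )
    # Backtrack from (n, m) to recover the optimal path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]
```

With such a path in hand, a per-frame blend weight can be interpolated along it, which is the kind of continuous control over timing and expression the abstract describes.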
Keywords :
"Synchronization","Face","Robustness","Three-dimensional displays","Interpolation"
Publisher :
ieee
Conference_Titel :
Computer Vision (ICCV), 2015 IEEE International Conference on
Electronic_ISSN :
2380-7504
Type :
conf
DOI :
10.1109/ICCV.2015.453
Filename :
7410810