DocumentCode
2935240
Title
3D facial expression editing based on the dynamic graph model
Author
Pei, Yuru ; Zha, Hongbin
Author_Institution
Key Lab. of Machine Perception (MOE), Peking Univ., Beijing, China
fYear
2009
fDate
June 28 - July 3, 2009
Firstpage
1354
Lastpage
1357
Abstract
Modeling a detailed 3D expressive face from limited user constraints is a challenging task. In this paper, we present a facial expression editing technique based on a dynamic graph model. The probabilistic relations between facial expressions and the complex combinations of local facial features, as well as the temporal behavior of facial expressions, are represented by a hierarchical dynamic Bayesian network. Given limited user constraints on a sparse feature mesh, the system infers the basis expression probabilities, which are used to locate the corresponding expressive mesh in the shape space spanned by the basis models. Experiments demonstrate that dense 3D facial meshes corresponding to the user constraints can be synthesized effectively.
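The abstract's shape-space step can be illustrated with a minimal numpy sketch: given inferred basis expression probabilities, the dense expressive mesh is located as a weighted (blendshape-style) combination of the basis models. All names, array shapes, and the normalization here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def synthesize_mesh(basis_meshes, expr_probs):
    """Blend basis expression meshes by inferred expression probabilities.

    basis_meshes: (K, V, 3) array -- K basis models with V vertices each
                  (hypothetical layout, one row of 3D coordinates per vertex).
    expr_probs:   (K,) array of nonnegative inferred probabilities.
    Returns the (V, 3) synthesized dense mesh.
    """
    basis_meshes = np.asarray(basis_meshes, dtype=float)
    w = np.asarray(expr_probs, dtype=float)
    w = w / w.sum()  # normalize weights onto the probability simplex
    # Weighted sum over the K basis models: sum_k w[k] * basis_meshes[k]
    return np.tensordot(w, basis_meshes, axes=1)

# Toy example: two "basis expressions" for a 2-vertex mesh.
neutral = np.zeros((2, 3))
smile = np.ones((2, 3))
mesh = synthesize_mesh([neutral, smile], [0.25, 0.75])
```

With weights 0.25 and 0.75, every coordinate of the toy result is 0.75, i.e. the mesh sits three quarters of the way from the neutral basis toward the smile basis along the line in shape space.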
Keywords
belief networks; graph theory; image processing; 3D facial expression editing; basis expression probabilities; dynamic graph model; hierarchical dynamic Bayesian network; local facial features; sparse feature mesh; Bayesian methods; Context modeling; Facial features; Facial muscles; Hidden Markov models; Humans; Laboratories; Network synthesis; Shape; Speech synthesis; Bayesian network; Expression editing
fLanguage
English
Publisher
ieee
Conference_Titel
2009 IEEE International Conference on Multimedia and Expo (ICME 2009)
Conference_Location
New York, NY
ISSN
1945-7871
Print_ISBN
978-1-4244-4290-4
Electronic_ISBN
1945-7871
Type
conf
DOI
10.1109/ICME.2009.5202754
Filename
5202754
Link To Document