DocumentCode :
3405342
Title :
Affect-expressive hand gestures synthesis and animation
Author :
Bozkurt, Elif ; Erzin, Engin ; Yemez, Yucel
Author_Institution :
Multimedia, Vision & Graphics Lab., Koc Univ., Istanbul, Turkey
fYear :
2015
fDate :
June 29 - July 3, 2015
Firstpage :
1
Lastpage :
6
Abstract :
Speech and hand gestures form a composite communicative signal that boosts the naturalness and expressiveness of communication. We present a multimodal framework for the joint analysis of continuous affect, speech prosody, and hand gestures, aimed at the automatic synthesis of realistic hand gestures from spontaneous speech using hidden semi-Markov models (HSMMs). To the best of our knowledge, this is the first attempt at synthesizing hand gestures using a continuous dimensional affect space, i.e., activation, valence, and dominance. We model the relationships between acoustic features describing speech prosody and hand gestures, with and without the continuous affect information, in speaker-independent configurations, and we evaluate the multimodal analysis framework by generating hand gesture animations as well as through objective evaluations. Our experimental results are promising, conveying the role of affect in modeling the dynamics of the speech-gesture relationship.
Keywords :
computer animation; gesture recognition; hidden Markov models; speech synthesis; HSMM; composite communicative signal; hand gesture animation; hand gesture synthesis; hidden semi-Markov model; multimodal analysis framework; speech-gesture synthesis; speech prosody; speech-gesture relationship; Animation; Correlation; Feature extraction; Hidden Markov models; Joints; Principal component analysis; Speech; Prosody analysis; continuous affect; gesture animation; hidden semi-Markov models
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Multimedia and Expo (ICME)
Conference_Location :
Turin, Italy
Type :
conf
DOI :
10.1109/ICME.2015.7177478
Filename :
7177478