DocumentCode :
2556334
Title :
Style-Based Motion Editing
Author :
Jia, Lingtao ; Yang, Yuedong ; Tang, Shaopeng ; Hao, Aimin
Author_Institution :
Beihang Univ., Beijing
fYear :
2007
fDate :
10-12 Dec. 2007
Firstpage :
129
Lastpage :
134
Abstract :
Motion capture and motion editing are both important methods for controlling human motion: the former records data from real, natural human movement, while the latter provides a way to edit that data. This paper combines the two using an inverse kinematics (IK) algorithm and proposes a style-based motion editing approach. In a preprocessing phase, the method performs principal component analysis (PCA) on a style motion to extract its style, and computes the key frames of a base motion. In the subsequent on-line editing phase, it transfers the extracted style onto selected key frames of the base motion via the IK algorithm, then interpolates the edited key frames to generate a new motion carrying the styles of both input motions. Experimental results show that the style-based approach achieves better synthesis performance.
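The pipeline the abstract describes (PCA style extraction, style transfer onto key frames, key-frame interpolation) can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the function names and the `weight` parameter are assumptions, the IK solve is replaced by a simple linear blend in the style subspace, and joint rotations would normally be interpolated with slerp rather than linearly.

```python
import numpy as np

def extract_style(style_motion, n_components=3):
    """PCA over style-motion frames (rows = frames, cols = joint parameters).
    Returns the mean pose and the top principal components as a 'style' basis."""
    mean = style_motion.mean(axis=0)
    centered = style_motion - mean
    # SVD yields principal components without forming the covariance matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def transfer_style(key_frames, style_mean, style_basis, weight=0.5):
    """Project each base key frame onto the style subspace and blend it in.
    (The paper enforces the styled pose with an IK solve; this linear blend
    is only a stand-in for that step.)"""
    centered = key_frames - style_mean
    projected = centered @ style_basis.T @ style_basis + style_mean
    return (1.0 - weight) * key_frames + weight * projected

def interpolate(key_frames, steps=10):
    """Linearly interpolate consecutive edited key frames into a new motion."""
    out = []
    for a, b in zip(key_frames[:-1], key_frames[1:]):
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            out.append((1.0 - t) * a + t * b)
    out.append(key_frames[-1])
    return np.asarray(out)
```

With a style motion of 50 frames and a base motion reduced to 5 key frames, `interpolate(..., steps=10)` yields 41 frames: 10 per key-frame interval plus the final pose.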
Keywords :
computer animation; feature extraction; image motion analysis; interpolation; principal component analysis; IK algorithm; PCA; human motion controlling methods; key frame interpolation; motion capture data; preprocessing phase; principal component analysis; style extraction; style-based motion editing approach; Data mining; Distributed computing; Gaussian processes; Hidden Markov models; Humans; Independent component analysis; Motion analysis; Motion control; Principal component analysis; Virtual reality;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Second Workshop on Digital Media and its Application in Museum & Heritages
Conference_Location :
Chongqing
Print_ISBN :
0-7695-3065-6
Type :
conf
DOI :
10.1109/DMAMH.2007.16
Filename :
4414540