DocumentCode
3716975
Title
Efficient movement representation by embedding Dynamic Movement Primitives in deep autoencoders
Author
Nutan Chen;Justin Bayer;Sebastian Urban;Patrick van der Smagt
Author_Institution
Faculty for Informatics, Technische Universität München, 80333 Munich, Germany
fYear
2015
Firstpage
434
Lastpage
440
Abstract
Predictive modeling of human or humanoid movement becomes increasingly complex as the dimensionality of those movements grows. Dynamic Movement Primitives (DMP) have been shown to be a powerful method of representing such movements, but they do not generalize well when used in configuration or task space. To solve this problem we propose a model called autoencoded dynamic movement primitive (AE-DMP), which uses deep autoencoders to find a representation of movement in a latent feature space in which DMP can generalize optimally. The architecture embeds the DMP inside such an autoencoder, allowing the whole model to be trained as a single unit. To further improve the model for multiple movements, a sparsity penalty is imposed on the feature-layer neurons, so that the various movements can be observed clearly in the feature space. After training, this sparsity lets the model identify a single hidden neuron that can efficiently generate new movements. Our experiments demonstrate the model's efficiency at imputing missing data in 50-dimensional human movement data.
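The abstract describes the AE-DMP architecture only at a high level. As a rough illustration, the following minimal PyTorch sketch shows one way an autoencoder with DMP dynamics in the latent feature space and a sparsity penalty on the feature layer might be wired up. All layer sizes, the choice of an L1 penalty for sparsity, the DMP gains, and the names (AEDMP, forcing, rollout) are illustrative assumptions, not the paper's actual configuration.

# Minimal sketch of the AE-DMP idea, assuming a 50-dimensional pose vector,
# an L1 sparsity penalty on the feature layer, and standard DMP dynamics
# integrated in the latent space. Hyperparameters are illustrative only.
import torch
import torch.nn as nn

class AEDMP(nn.Module):
    def __init__(self, pose_dim=50, feature_dim=5, n_basis=20):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(pose_dim, 100), nn.Tanh(),
                                     nn.Linear(100, feature_dim))
        self.decoder = nn.Sequential(nn.Linear(feature_dim, 100), nn.Tanh(),
                                     nn.Linear(100, pose_dim))
        # DMP forcing-term weights, one set of basis weights per latent dim.
        self.w = nn.Parameter(torch.zeros(feature_dim, n_basis))
        self.centers = torch.linspace(0.0, 1.0, n_basis)
        self.widths = torch.full((n_basis,), n_basis ** 2 / 2.0)
        self.alpha, self.beta, self.alpha_x = 25.0, 6.25, 8.0

    def forcing(self, x):
        # Weighted sum of Gaussian basis functions, gated by the phase x.
        psi = torch.exp(-self.widths * (x - self.centers) ** 2)
        return (self.w @ psi) / psi.sum() * x

    def rollout(self, z0, g, steps, dt=0.01):
        # Integrate the DMP transformation system in latent space:
        #   z'' = alpha * (beta * (g - z) - z') + f(x)
        z, zd, x = z0.clone(), torch.zeros_like(z0), torch.tensor(1.0)
        traj = []
        for _ in range(steps):
            zdd = self.alpha * (self.beta * (g - z) - zd) + self.forcing(x)
            zd = zd + zdd * dt
            z = z + zd * dt
            x = x - self.alpha_x * x * dt  # canonical (phase) system
            traj.append(z)
        return torch.stack(traj)

def loss(model, poses, lam=1e-3):
    # Reconstruction error plus an L1 sparsity penalty on the feature layer;
    # decoding a latent rollout through model.decoder yields new movements.
    z = model.encoder(poses)
    recon = model.decoder(z)
    return ((recon - poses) ** 2).mean() + lam * z.abs().mean()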
Keywords
"Neurons","Feature extraction","Decoding","Training","Noise reduction","Biological neural networks","Trajectory"
Publisher
ieee
Conference_Titel
2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)
Type
conf
DOI
10.1109/HUMANOIDS.2015.7363570
Filename
7363570