• DocumentCode
    3024967
  • Title
    Automated gesture segmentation from dance sequences
  • Author
    Kahol, Kanav; Tripathi, Priyamvada; Panchanathan, Sethuraman

  • Author_Institution
    Dept. of Comput. Sci. & Eng., Arizona State Univ., Tempe, AZ, USA
  • fYear
    2004
  • fDate
    17-19 May 2004
  • Firstpage
    883
  • Lastpage
    888
  • Abstract
    Complex human motion sequences (e.g., dance) are typically analyzed by segmenting them into shorter motion sequences, called gestures. However, this segmentation process is subjective and varies considerably from one choreographer to another. Dance sequences also exhibit a large vocabulary of gestures. In this paper, we propose an algorithm called hierarchical activity segmentation. This algorithm employs a dynamic hierarchical layered structure to represent human anatomy, and uses low-level motion parameters to characterize motion in the various layers of this hierarchy, which correspond to different segments of the human body. This characterization is combined with a naive Bayesian classifier to derive, from empirical data, choreographer profiles that are then used to predict how particular choreographers would segment gestures in other motion sequences. When the predictions were tested with a library of 45 3D motion capture sequences (with 185 distinct gestures) created by 5 different choreographers, they were found to be 93.3% accurate.
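    The classification step the abstract describes can be sketched with a small Gaussian naive Bayes classifier. This is an illustrative reconstruction, not the authors' implementation: the features (per-frame limb speed and acceleration magnitude), the boundary/non-boundary labeling, and all data below are assumptions chosen to show how low-level motion parameters could feed a naive Bayesian boundary predictor.

    ```python
    import math

    # Hypothetical sketch: a Gaussian naive Bayes classifier labels each
    # frame as a gesture boundary (1) or not (0) from low-level motion
    # features. Feature choices and data are illustrative assumptions.

    def fit_gaussian_nb(samples, labels):
        """Estimate class priors and per-feature mean/variance per class."""
        model = {}
        for cls in set(labels):
            rows = [s for s, l in zip(samples, labels) if l == cls]
            prior = len(rows) / len(samples)
            stats = []
            for j in range(len(rows[0])):
                col = [r[j] for r in rows]
                mean = sum(col) / len(col)
                # Small floor on the variance avoids division by zero.
                var = sum((x - mean) ** 2 for x in col) / len(col) + 1e-6
                stats.append((mean, var))
            model[cls] = (prior, stats)
        return model

    def predict(model, x):
        """Return the class with the highest log-posterior for features x."""
        best_cls, best_lp = None, float("-inf")
        for cls, (prior, stats) in model.items():
            lp = math.log(prior)
            for xj, (mean, var) in zip(x, stats):
                # Log of the Gaussian likelihood for feature j.
                lp += -0.5 * math.log(2 * math.pi * var) - (xj - mean) ** 2 / (2 * var)
            if lp > best_lp:
                best_cls, best_lp = cls, lp
        return best_cls

    # Toy training data: [mean limb speed, acceleration magnitude] per frame;
    # label 1 marks a (hypothetical) choreographer-annotated gesture boundary,
    # on the assumption that boundaries tend to coincide with low motion.
    X = [[0.10, 0.05], [0.12, 0.04], [0.90, 0.80],
         [0.95, 0.70], [0.08, 0.06], [1.00, 0.90]]
    y = [1, 1, 0, 0, 1, 0]

    model = fit_gaussian_nb(X, y)
    print(predict(model, [0.1, 0.05]))  # a slow frame classifies as a boundary here
    ```

    In the paper's setting, one such profile would be trained per choreographer, so that the classifier captures that choreographer's individual segmentation style.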
  • Keywords
    Bayes methods; humanities; image motion analysis; image segmentation; image sequences; Bayesian classifier; automated gesture segmentation; dance sequences; hierarchical activity segmentation; hierarchical layered structure; human anatomy; human motion sequences; Bayesian methods; Computer science; Heuristic algorithms; Hidden Markov models; Human anatomy; Libraries; Motion analysis; Testing; Ubiquitous computing; Vocabulary;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Automatic Face and Gesture Recognition, 2004. Proceedings. Sixth IEEE International Conference on
  • Print_ISBN
    0-7695-2122-3
  • Type
    conf
  • DOI
    10.1109/AFGR.2004.1301645
  • Filename
    1301645