• DocumentCode
    1747555
  • Title
    Dynamic gestures as an input device for directing a mobile platform
  • Author
    Ehrenmann, M.; Lütticke, T.; Dillmann, R.
  • Author_Institution
    Inst. for Process Control & Robotics, Karlsruhe Univ., Germany
  • Volume
    3
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    2596
  • Abstract
    Instructing a mobile robot still requires classical user interfaces. A more intuitive way of commanding can be provided by verbal or gesture commands. In this article, we present new approaches and enhancements to established methods in use in our laboratory. Our aim is to direct a robot with simple dynamic gestures, and we focus on visual gesture recognition. Skin color segmentation algorithms track the user's hand, and hidden Markov models recognize the gesture type. The filters applied to the recorded trajectory strongly compress the input data; they also mark the start and end points of a possible gesture. The hidden Markov models have been enhanced with a threshold model to filter out insignificant movements. Pre-classification of the reference gestures keeps the computational effort low.
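    The abstract's pipeline begins with skin-color segmentation to locate the user's hand in each frame; the hand's centroid over time then forms the trajectory fed to the hidden Markov models. A minimal illustrative sketch of such a segmentation step is shown below. It is not the authors' implementation: it uses a common explicit RGB skin heuristic, and all thresholds are assumptions chosen for illustration.

    ```python
    import numpy as np

    def skin_mask(img):
        """Boolean mask of skin-like pixels in an RGB uint8 image.

        Uses a simple explicit RGB rule; the thresholds below are
        illustrative, not taken from the paper.
        """
        r = img[..., 0].astype(int)
        g = img[..., 1].astype(int)
        b = img[..., 2].astype(int)
        mx = np.maximum(np.maximum(r, g), b)
        mn = np.minimum(np.minimum(r, g), b)
        return ((r > 95) & (g > 40) & (b > 20) &
                (mx - mn > 15) & (np.abs(r - g) > 15) &
                (r > g) & (r > b))

    def hand_centroid(img):
        """Centroid (row, col) of the skin region, or None if no skin found.

        Tracking this centroid frame by frame yields the gesture
        trajectory that a downstream classifier would consume.
        """
        ys, xs = np.nonzero(skin_mask(img))
        if ys.size == 0:
            return None
        return float(ys.mean()), float(xs.mean())
    ```

    Running `hand_centroid` over a video sequence produces a 2-D trajectory; in the paper's pipeline, such a trajectory would then be compressed and segmented by filters before HMM-based gesture-type recognition.
    
    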
  • Keywords
    gesture recognition; hidden Markov models; image colour analysis; image segmentation; learning systems; mobile robots; robot programming; robot vision; dynamic gestures; hidden Markov models; learning system; mobile robot; robot programming; robot vision; skin color segmentation; user hand tracking; visual gesture recognition; Data gloves; Fingers; Hidden Markov models; Humans; Magnetic sensors; Mobile robots; Process control; Skin; Tracking; User interfaces;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA 2001)
  • ISSN
    1050-4729
  • Print_ISBN
    0-7803-6576-3
  • Type
    conf
  • DOI
    10.1109/ROBOT.2001.933014
  • Filename
    933014