• DocumentCode
    2990002
  • Title
    Towards a humanoid museum guide robot that interacts with multiple persons
  • Author
    Bennewitz, Maren ; Faber, Felix ; Joho, Dominik ; Schreiber, Michael ; Behnke, Sven
  • Author_Institution
    Comput. Sci. Inst., Freiburg Univ.
  • fYear
    2005
  • fDate
    5 Dec. 2005
  • Firstpage
    418
  • Lastpage
    423
  • Abstract
    The purpose of our research is to develop a humanoid museum guide robot that performs intuitive, multimodal interaction with multiple persons. In this paper, we present a robotic system that uses visual perception, sound source localization, and speech recognition to detect, track, and involve multiple persons in interaction. Depending on the audio-visual input, our robot shifts its attention between different persons. To direct the attention of its communication partners towards exhibits, our robot performs gestures with its eyes and arms. As we demonstrate in practical experiments, our robot is able to interact with multiple persons in a multimodal way and to shift its attention between different people. Furthermore, we discuss experiences gained during a two-day public demonstration of our robot.
  • Keywords
    artificial intelligence; gesture recognition; humanoid robots; speech recognition; visual perception; human-robot interaction; humanoid museum guide robot; sound source localization; Computer science; Eyes; Facial animation; Human robot interaction; Humanoid robots; Manipulators; Multimodal sensors; Robot sensing systems; Speech recognition; Visual perception
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    5th IEEE-RAS International Conference on Humanoid Robots, 2005
  • Conference_Location
    Tsukuba
  • Print_ISBN
    0-7803-9320-1
  • Type
    conf
  • DOI
    10.1109/ICHR.2005.1573603
  • Filename
    1573603
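  • Illustrative_Sketch
    The abstract describes the robot shifting its attention between persons depending on audio-visual input (visual perception, sound source localization, speech recognition). The minimal Python sketch below shows one way such an attention-selection step could be scored; the PersonTrack fields, the weights, and the select_focus function are illustrative assumptions, not the authors' implementation.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class PersonTrack:
        """Hypothetical per-person state fused from vision and sound source localization."""
        person_id: int
        face_visible: bool      # face currently detected by the vision system
        distance_m: float       # estimated distance to the robot
        is_speaking: bool       # sound source localized at this person's bearing
        seconds_ignored: float  # time since the robot last attended to this person

    def attention_score(p: PersonTrack) -> float:
        """Combine audio-visual cues into a single score (weights are assumptions)."""
        score = 0.0
        if p.is_speaking:
            score += 2.0                             # speech is the strongest cue
        if p.face_visible:
            score += 1.0                             # a visible face suggests engagement
        score += max(0.0, 1.5 - 0.5 * p.distance_m)  # prefer nearby persons
        score += 0.1 * p.seconds_ignored             # slowly re-engage ignored persons
        return score

    def select_focus(tracks: List[PersonTrack]) -> Optional[PersonTrack]:
        """Pick the person the robot should shift its attention to, if any."""
        return max(tracks, key=attention_score, default=None)

    if __name__ == "__main__":
        people = [
            PersonTrack(1, face_visible=True, distance_m=1.2, is_speaking=False, seconds_ignored=4.0),
            PersonTrack(2, face_visible=True, distance_m=2.0, is_speaking=True, seconds_ignored=0.0),
        ]
        focus = select_focus(people)
        print(f"attend to person {focus.person_id}")  # person 2: currently speaking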