• DocumentCode
    2065158
  • Title
    Situated perception of a partner robot based on neuro-fuzzy computing
  • Author
    Kubota, Naoyuki; Nishida, Kenichiro
  • Author_Institution
    Dept. of Syst. Design, Tokyo Metropolitan Univ., Japan
  • fYear
    2005
  • fDate
    12-15 June 2005
  • Firstpage
    172
  • Lastpage
    177
  • Abstract
    This paper discusses the situated perception of a partner robot interacting with a human. The situated perception is based on the mutual cognitive environment discussed in relevance theory. Sharing the actual environment is essential for realizing natural communication, because the meaning of natural language and gestures can be interpreted according to the surrounding environment. Next, we discuss the role of imitation based on gesture and explain the method for imitative behavior generation. Finally, we show experimental results of situated perception, imitative behavior generation, and behavior coordination through interaction with a human.
  • Keywords
    fuzzy neural nets; learning (artificial intelligence); robots; behavior coordination; evolutionary computation; human-robot interaction; imitative behavior generation; imitative learning; neuro-fuzzy computing; partner robot; relevance theory; situated perception; spiking neural networks; Charge coupled devices; Cognitive robotics; Human robot interaction; Humanoid robots; Mobile robots; Natural languages; Robot kinematics; Robot sensing systems; Robot vision systems; Tactile sensors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Advanced Robotics and its Social Impacts, 2005. IEEE Workshop on
  • Print_ISBN
    0-7803-8947-6
  • Type
    conf
  • DOI
    10.1109/ARSO.2005.1511646
  • Filename
    1511646