• DocumentCode
    237769
  • Title
    Learning human-like facial expressions for Android Phillip K. Dick
  • Author
    Habib, Ahsan ; Das, Sajal K. ; Bogdan, Ioana-Corina ; Hanson, D. ; Popa, Dan O.
  • Author_Institution
    Electr. Eng. Dept., Univ. of Texas at Arlington, Arlington, TX, USA
  • fYear
    2014
  • fDate
    18-22 Aug. 2014
  • Firstpage
    1159
  • Lastpage
    1165
  • Abstract
    Android robots with human-like appearance can engage in superior conversational interaction with humans due to their ability to generate expressive faces. In this paper we describe a learning algorithm that maps the rich facial features of a human onto an Android capable of replicating them. The methodology we employ is automated, marker-less, and highly general. As a result, it can be applied to any Android face design, not just those exploiting linear or decoupled relationships between actuator inputs and facial feature point motions. To ensure robust results, we employed a genetic algorithm to search the actuator space and generate a rich set of robot facial expressions. Experimental results clearly show that, once learning is complete, accurate facial expressions for the Phillip K. Dick Android can be achieved by mimicking those of a human.
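    The core loop described in the abstract, searching the actuator space with a genetic algorithm so that the robot's facial feature points match those extracted from a human face, can be pictured with the minimal Python sketch below. It is an illustrative reconstruction only, not the paper's implementation: the forward_kinematics model, the actuator and feature counts, and the GA operators (elite selection, averaging crossover, Gaussian mutation) are all assumed placeholders.

    # Illustrative sketch only: a genetic-algorithm search over a robot's actuator
    # space, scoring candidate actuator vectors by how closely the resulting facial
    # feature points match a target human expression. All models and dimensions
    # below are hypothetical stand-ins, not the paper's actual system.
    import numpy as np

    NUM_ACTUATORS = 12   # assumed number of face servos
    NUM_FEATURES = 20    # assumed number of tracked facial feature points (x, y pairs)

    def forward_kinematics(actuators: np.ndarray) -> np.ndarray:
        """Placeholder mapping from actuator commands to facial feature points.
        In practice this would be the learned, possibly nonlinear, robot face model."""
        rng = np.random.default_rng(0)
        W = rng.standard_normal((NUM_FEATURES * 2, NUM_ACTUATORS))
        return np.tanh(W @ actuators)

    def fitness(actuators: np.ndarray, target_features: np.ndarray) -> float:
        """Negative Euclidean error between produced and target feature points."""
        return -np.linalg.norm(forward_kinematics(actuators) - target_features)

    def genetic_search(target_features, pop_size=60, generations=200,
                       mutation_scale=0.05, elite_frac=0.2, seed=1):
        rng = np.random.default_rng(seed)
        population = rng.uniform(-1.0, 1.0, size=(pop_size, NUM_ACTUATORS))
        n_elite = max(1, int(elite_frac * pop_size))
        for _ in range(generations):
            scores = np.array([fitness(ind, target_features) for ind in population])
            elite = population[np.argsort(scores)[-n_elite:]]
            # Crossover: average two random elite parents, then apply Gaussian mutation.
            parents_a = elite[rng.integers(0, n_elite, size=pop_size)]
            parents_b = elite[rng.integers(0, n_elite, size=pop_size)]
            children = 0.5 * (parents_a + parents_b)
            children += rng.normal(0.0, mutation_scale, size=children.shape)
            population = np.clip(children, -1.0, 1.0)
        scores = np.array([fitness(ind, target_features) for ind in population])
        return population[np.argmax(scores)]

    if __name__ == "__main__":
        # Target: feature points extracted from a human face (random stand-in data here).
        target = np.random.default_rng(42).uniform(-1.0, 1.0, size=NUM_FEATURES * 2)
        best_actuators = genetic_search(target)
        print("best actuator vector:", np.round(best_actuators, 3))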
  • Keywords
    actuators; genetic algorithms; human-robot interaction; humanoid robots; Phillip K. Dick android; actuator; android face design; android robots; conversational interaction; expressive face generation; facial feature point motions; genetic algorithm; human-like appearance; human-like facial expression learning; robot facial expressions; Actuators; Androids; Facial features; Humanoid robots; Robot kinematics; Servomotors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Automation Science and Engineering (CASE), 2014 IEEE International Conference on
  • Conference_Location
    Taipei
  • Type
    conf
  • DOI
    10.1109/CoASE.2014.6899473
  • Filename
    6899473