  • DocumentCode
    2628080
  • Title
    Learning models of speaker head nods with affective information
  • Author
    Lee, Jina ; Prendinger, Helmut ; Neviarouskaya, Alena ; Marsella, Stacy
  • Author_Institution
    Univ. of Southern California, Los Angeles, CA, USA
  • fYear
    2009
  • fDate
    10-12 Sept. 2009
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    During face-to-face conversation, the speaker's head is continually in motion. These movements serve a variety of important communicative functions, and may also be influenced by our emotions. The goal of this work is to build a domain-independent model of speakers' head movements and to investigate the effect of using affective information during the learning process. Once the model is learned, it can later be used to generate head movements for virtual agents. In this paper, we describe our machine-learning approach to predicting speakers' head nods using an annotated corpus of face-to-face human interaction and emotion labels generated by an affect recognition model. We describe the feature selection process, the training process, and a comparison of the results of the learned models under varying conditions. The results show that using affective information helps predict head nods better than when no affective information is used.
  • Keywords
    emotion recognition; learning (artificial intelligence); affective information; domain-independent model; face-to-face conversation; face-to-face human interaction; feature selection process; learning models; learning process; machine-learning approach; speaker head nods; training process; virtual agents; Emotion recognition; Face recognition; Humans; Informatics; Information analysis; Machine learning; Magnetic heads; Predictive models; Real time systems; Testing
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009)
  • Conference_Location
    Amsterdam
  • Print_ISBN
    978-1-4244-4800-5
  • Electronic_ISBN
    978-1-4244-4799-2
  • Type
    conf
  • DOI
    10.1109/ACII.2009.5349543
  • Filename
    5349543