  • DocumentCode
    76132
  • Title
    Correcting Time-Continuous Emotional Labels by Modeling the Reaction Lag of Evaluators
  • Author
    Mariooryad, Soroosh; Busso, Carlos
  • Author_Institution
    Erik Jonsson Sch. of Eng. & Comput. Sci., Univ. of Texas at Dallas, Dallas, TX, USA
  • Volume
    6
  • Issue
    2
  • fYear
    2015
  • fDate
    April-June 1 2015
  • Firstpage
    97
  • Lastpage
    108
  • Abstract
    An appealing scheme to characterize expressive behaviors is the use of emotional dimensions such as activation (calm versus active) and valence (negative versus positive). These descriptors offer many advantages for describing the wide spectrum of emotions. Due to the continuous nature of fast-changing expressive vocal and gestural behaviors, it is desirable to continuously track these emotional traces, capturing subtle and localized events (e.g., with FEELTRACE). However, time-continuous annotations introduce challenges that affect the reliability of the labels. In particular, an important issue is the evaluators' reaction lag caused by observing, appraising, and responding to the expressive behaviors. An empirical analysis demonstrates that this delay varies from 1 to 6 seconds, depending on the annotator, expressive dimension, and actual behaviors. Our experiments show accuracy improvements even with fixed delays (1-3 seconds). This paper proposes to compensate for this reaction lag by finding the time-shift that maximizes the mutual information between the expressive behaviors and the time-continuous annotations. The approach is implemented by making different assumptions about the evaluators' reaction lag. The benefits of compensating for the delay are demonstrated with emotion classification experiments. On average, the classifiers trained with facial and speech features show more than 7 percent relative improvement over baseline classifiers trained and tested without shifting the time-continuous annotations.
  • Keywords
    emotion recognition; image classification; psychology; delay compensation; emotion classification; emotional dimension; evaluator reaction lag modelling; expressive behavior; time shift; time-continuous annotation; Acoustics; Databases; Delays; Emotion recognition; Feature extraction; Gold; Mutual information; Time-continuous emotion annotation; emotion recognition; emotional descriptors; maximum mutual information
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Affective Computing
  • Publisher
    IEEE
  • ISSN
    1949-3045
  • Type
    jour
  • DOI
    10.1109/TAFFC.2014.2334294
  • Filename
    6847125
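
A minimal sketch of the delay-compensation idea summarized in the Abstract: for each candidate reaction lag, shift the time-continuous annotation back in time and keep the shift that maximizes the mutual information with the frame-level expressive features. The inputs below (features, annotation, frame_rate, max_lag) are hypothetical, frame-aligned quantities assumed for illustration; this is not the authors' implementation.

    import numpy as np

    def mutual_information(x, y, bins=16):
        # Histogram-based estimate of the mutual information (in nats)
        # between two 1-D signals.
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def best_reaction_lag(features, annotation, max_lag, frame_rate):
        # features:   1-D array with one expressive feature value per frame
        #             (e.g., an acoustic or facial descriptor)
        # annotation: 1-D time-continuous emotional trace at the same frame rate
        # max_lag:    largest delay considered, in seconds
        # Returns the lag (in frames) whose shift maximizes mutual information.
        best_lag, best_mi = 0, -np.inf
        for lag in range(int(max_lag * frame_rate) + 1):
            if lag == 0:
                mi = mutual_information(features, annotation)
            else:
                # Shift the annotation back by `lag` frames so it aligns with
                # the behavior that elicited it.
                mi = mutual_information(features[:-lag], annotation[lag:])
            if mi > best_mi:
                best_lag, best_mi = lag, mi
        return best_lag

    # Example call (hypothetical names): search delays of up to 6 s on 25 fps traces,
    # matching the 1-6 second range of lags reported in the abstract.
    # lag_frames = best_reaction_lag(f0_contour, arousal_trace, max_lag=6.0, frame_rate=25)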