DocumentCode :
1049323
Title :
Emotion recognition based on physiological changes in music listening
Author :
Kim, Jonghwa ; André, Elisabeth
Author_Institution :
Inst. für Inf., Univ. of Augsburg, Augsburg, Germany
Volume :
30
Issue :
12
fYear :
2008
Firstpage :
2067
Lastpage :
2083
Abstract :
Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological data set to a feature-based multiclass classification. In order to collect a physiological data set from multiple subjects over many weeks, we used a musical induction method that spontaneously leads subjects to real emotional states, without any deliberate laboratory setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity, and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, and positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. An improved recognition accuracy of 95 percent and 70 percent for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.
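The EMDC scheme described in the abstract exploits the dichotomic structure of the 2D emotion model by replacing one four-class decision with a cascade of binary decisions. The Python sketch below illustrates that idea under stated assumptions: the class name EMDCSketch, the stage ordering (arousal first, then valence within each arousal branch), and the use of scikit-learn's LinearDiscriminantAnalysis as a stand-in for the paper's pLDA are illustrative choices, not the authors' implementation.

# Minimal sketch of emotion-specific multilevel dichotomous classification (EMDC).
# Assumption: the four musical emotions are split first by arousal (high/low),
# then by valence (positive/negative) within each arousal branch; sklearn's
# LinearDiscriminantAnalysis stands in for the pseudoinverse-based pLDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Class codes for the four musical emotions:
# 0 = positive/high arousal, 1 = negative/high arousal,
# 2 = negative/low arousal,  3 = positive/low arousal
HIGH_AROUSAL = {0, 1}
POSITIVE = {0, 3}

class EMDCSketch:
    def fit(self, X, y):
        y = np.asarray(y)
        # Stage 1: binary arousal classifier (high vs. low)
        arousal = np.isin(y, list(HIGH_AROUSAL)).astype(int)
        self.arousal_clf = LinearDiscriminantAnalysis().fit(X, arousal)
        # Stage 2: one binary valence classifier per arousal branch
        self.valence_clf = {}
        for a in (0, 1):
            idx = arousal == a
            valence = np.isin(y[idx], list(POSITIVE)).astype(int)
            self.valence_clf[a] = LinearDiscriminantAnalysis().fit(X[idx], valence)
        return self

    def predict(self, X):
        a_pred = self.arousal_clf.predict(X)
        out = np.empty(len(X), dtype=int)
        # Map (arousal, valence) decisions back to the four emotion codes
        mapping = {(1, 1): 0, (1, 0): 1, (0, 0): 2, (0, 1): 3}
        for a in (0, 1):
            idx = a_pred == a
            if not idx.any():
                continue
            v_pred = self.valence_clf[a].predict(X[idx])
            out[idx] = [mapping[(a, v)] for v in v_pred]
        return out

# Toy usage with random stand-in features (the paper derives features from
# EMG, ECG, skin conductivity, and respiration signals instead).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = rng.integers(0, 4, size=200)
    model = EMDCSketch().fit(X, y)
    print(model.predict(X[:5]))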
Keywords :
biosensors; electrocardiography; electromyography; emotion recognition; entropy; feature extraction; medical signal processing; physiology; signal classification; time-frequency analysis; audiovisual emotion channels; automatic recognition system; electrocardiogram; electromyogram; facial expression; feature-based multiclass classification; geometric analysis; multiscale entropy; music listening; musical emotions; musical induction method; physiological changes; physiological signals; respiration changes; skin conductivity; speech expression; subband spectra; time-frequency domain analysis; Audio recording; Biosensors; Conductivity measurement; Disk recording; Emotion recognition; Entropy; Frequency; Laboratories; Skin; Speech; Classifier design and evaluation; Feature evaluation and selection; Human-centered computing; Interaction styles; Methodologies and techniques; Pattern analysis; Robotics; Signal analysis; Signal processing; Theory and methods; User/Machine Systems; synthesis and processing; Adaptation, Physiological; Algorithms; Arousal; Artificial Intelligence; Auditory Perception; Emotions; Humans; Monitoring, Physiologic; Music; Pattern Recognition, Automated;
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.2008.26
Filename :
4441720