Abstract:
Automatic recognition of emotional states is a challenging problem that attracts researchers from many areas, such as human-computer interaction, verbal and non-verbal communication, data-driven facial animation, and the analysis of the faces of autistic children. In this work, the emotion recognition problem is investigated and a novel approach is proposed that works in real time and recognizes surprise, anger, happiness, sadness, fear, disgust, and the neutral expression. In the proposed system, facial landmarks are tracked using a tracker based on multi-resolution active shape models. Classification is then performed by comparing a feature vector of high-level features, extracted from the landmark locations and the gradients of specific facial regions, with the feature vectors of the emotional expressions. In experiments conducted on 5 subjects, a success rate of 75.23% is achieved for the seven classes, whereas a success rate of 100% is obtained when only four classes (surprise, anger, happiness, and the neutral expression) are used. The system also gives promising results on partially occluded faces. The implementation runs in real time and includes modules for person and environment adaptation.
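The abstract does not specify the distance measure or classifier used for the comparison step; the following Python sketch only illustrates the kind of difference-based decision it describes, assuming one template feature vector per expression and a Euclidean distance. The feature dimensionality and the template values below are placeholder assumptions, not the paper's actual design.

import numpy as np

# Expression classes recognized by the system described in the abstract.
EXPRESSIONS = ["surprise", "anger", "happiness", "sadness",
               "fear", "disgust", "neutral"]

def classify_expression(feature_vec, templates):
    """Assign the expression whose template differs least from feature_vec.

    feature_vec : 1-D array of high-level features for the current frame
                  (e.g. derived from landmark locations and region gradients).
    templates   : dict mapping expression name -> template feature vector.
    """
    diffs = {name: np.linalg.norm(feature_vec - tmpl)
             for name, tmpl in templates.items()}
    return min(diffs, key=diffs.get)

# Example usage with random placeholder data (illustration only).
rng = np.random.default_rng(0)
templates = {name: rng.normal(size=16) for name in EXPRESSIONS}
frame_features = rng.normal(size=16)
print(classify_expression(frame_features, templates))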
Keywords:
Emotion recognition, Face recognition, Feature extraction, Signal processing