Authors:
Goshvarpour, Atefeh (Department of Biomedical Engineering, Faculty of Electrical Engineering, Sahand University of Technology); Goshvarpour, Ateke (Department of Biomedical Engineering, Imam Reza International University)
Abstract:
Introduction: Designing an automated emotion recognition system using biosignals has become an active and challenging topic in many fields, including human-computer interaction, robotics, and affective computing. Several algorithms have been proposed to characterize the internal and external behaviors of subjects confronting emotional events/stimuli. Eye movements, as an external behavior, are habitually analyzed within multi-modality systems using classic statistical measures, while the evaluation of their dynamics has been neglected so far. Materials and Methods: This experiment aimed to provide an innovative single-modality scheme for emotion classification using eye-blinking data. The dynamics of the eye-blinking data were characterized by weighted visibility graph-based indices. The extracted measures were then fed to different classifiers, including support vector machine, decision tree, k-nearest neighbor, adaptive boosting, and random subset, to classify sad, happy, neutral, and fearful affective states. The scheme was evaluated using the signals available in the SEED-IV database. Results: The proposed framework provided significant performance in terms of recognition rates. The highest average recognition rate of 90% was achieved using the decision tree. Conclusion: In brief, our results showed that eye-blinking data have potential for emotion recognition. The present scheme can be extended to the design of future affect recognition systems.
Keywords:
Dynamics, Emotion Recognition, Eye-Blinking, Weighted Visibility Graph
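
The pipeline summarized in the abstract (weighted visibility graph indices extracted from eye-blinking segments, then passed to a classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the view-angle edge weighting, the two toy graph indices, the synthetic segments, and the scikit-learn decision tree are all assumptions chosen for demonstration.

# Illustrative sketch only: the view-angle edge weights, the two toy graph
# indices, the synthetic segments, and the scikit-learn decision tree are
# assumptions for demonstration, not the configuration used in the paper.
import numpy as np
import networkx as nx
from sklearn.tree import DecisionTreeClassifier


def weighted_visibility_graph(signal):
    """Natural visibility graph of a 1-D signal with view-angle edge weights.

    Samples i and j are connected if every sample between them lies below
    the straight line joining (i, signal[i]) and (j, signal[j]).
    """
    n = len(signal)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                signal[k] < signal[j] + (signal[i] - signal[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                # One common weighting: absolute view angle between the samples.
                g.add_edge(i, j, weight=abs(np.arctan2(signal[j] - signal[i], j - i)))
    return g


def graph_indices(g):
    """Two toy graph-based measures: mean node strength and edge density."""
    strengths = [s for _, s in g.degree(weight="weight")]
    n = g.number_of_nodes()
    return [float(np.mean(strengths)), 2.0 * g.number_of_edges() / (n * (n - 1))]


# Hypothetical data: each row stands in for an eye-blinking segment,
# each label for one of the four affective states (0=sad ... 3=fear).
rng = np.random.default_rng(0)
segments = rng.standard_normal((24, 100))
labels = rng.integers(0, 4, size=24)

features = np.array([graph_indices(weighted_visibility_graph(s)) for s in segments])
clf = DecisionTreeClassifier(random_state=0).fit(features, labels)
print(clf.predict(features[:5]))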