Title :
Facial expression recognition by using crossing point distribution
Author :
Theekapun, Charoenpong ; Suchada, Tantisatirapong ; Ajaree, Supasuteekul ; Tokai, Shogo ; Hase, Hiroyuki
Author_Institution :
Biomed. Eng. Programme, Srinakharinwirot Univ., Nakonnayok, Thailand
Abstract :
Existing methods struggle to recognize facial expressions from a 2.5D partial face data set captured from an arbitrary viewpoint between -45° and +45°. We therefore propose a novel method for recognizing facial expressions from such 2.5D partial face data, developed for subject-independent facial expression recognition. To recognize an expression, a 3D virtual expression face is first reconstructed from the 2.5D partial face data set. The facial expression is then represented by the change of crossing points on a face plane, which is divided into 196 (14×14) region partitions according to the crossing point distribution. The numbers of crossing points in the 196 region partitions are used for recognition by means of a support vector machine (SVM). Experiments were conducted on four facial expressions (neutral, anger, surprise, and smiling) of 22 subjects, and the recognition accuracy was 60.9%.
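The following is a minimal sketch, not the authors' code, of the feature-extraction and classification step the abstract describes: crossing points on a normalized face plane are binned into a 14×14 grid, and the 196 per-cell counts feed an SVM. The crossing-point extraction from the reconstructed 3D virtual face is assumed to happen upstream; synthetic points and a linear kernel are placeholder assumptions here.

```python
import numpy as np
from sklearn.svm import SVC

GRID = 14  # 14 x 14 = 196 region partitions, as in the paper


def crossing_point_histogram(points, grid=GRID):
    """Count crossing points per cell on a face plane normalized to [0,1] x [0,1].

    points : (N, 2) array of (x, y) crossing-point coordinates.
    Returns a flattened (grid*grid,) vector of per-cell counts.
    """
    hist, _, _ = np.histogram2d(
        points[:, 0], points[:, 1],
        bins=grid, range=[[0.0, 1.0], [0.0, 1.0]],
    )
    return hist.ravel()  # 196-dimensional feature vector for grid = 14


# Synthetic stand-in data: 4 expression classes (neutral, anger, surprise, smiling),
# a few samples each; real crossing points would come from the reconstructed face.
rng = np.random.default_rng(0)
X, y = [], []
for cls in range(4):
    for _ in range(10):
        # each sample: a set of crossing points whose spread depends on the class
        pts = rng.beta(2 + cls, 2, size=(300, 2))
        X.append(crossing_point_histogram(pts))
        y.append(cls)
X, y = np.vstack(X), np.array(y)

clf = SVC(kernel="linear")  # kernel choice is an assumption, not stated in the abstract
clf.fit(X[::2], y[::2])     # even-indexed samples for training
print(clf.score(X[1::2], y[1::2]))  # odd-indexed samples for a quick sanity check
```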
Keywords :
face recognition; support vector machines; 2.5D partial face data set; 3D virtual expression face; anger facial expression; crossing point distribution; facial expression recognition; neutral facial expression; smiling facial expression; support vector machine; surprise facial expression; Biomedical engineering; Face recognition; Image recognition; Image reconstruction; Information science; Linear discriminant analysis; Mechanical engineering; Psychology; Support vector machines; Surface reconstruction;
Conference_Title :
Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, 2009. ECTI-CON 2009. 6th International Conference on
Conference_Location :
Pattaya, Chonburi
Print_ISBN :
978-1-4244-3387-2
Electronic_ISBN :
978-1-4244-3388-9
DOI :
10.1109/ECTICON.2009.5137224