DocumentCode :
1590601
Title :
Biometrics authentication method using lip motion in utterance
Author :
Sayo, Atsushi ; Kajikawa, Yoshinobu ; Muneyasu, Mitsuji
Author_Institution :
Fac. of Eng. Science, Kansai Univ., Suita, Japan
fYear :
2011
Firstpage :
1
Lastpage :
5
Abstract :
In this paper, we propose a biometrics authentication method using lip motion during utterance. The proposed method authenticates persons using lip shape (a physical trait) and lip motion accompanying utterance (a behavioral trait). Hence, it can be realized with only a camera that extracts the lip area, without the special equipment used in other personal authentication methods, and the registration data can easily be changed. Conventional methods based on lip images use the time differential coefficients of the horizontal and vertical lip ranges and their corresponding rates. However, these features cannot describe lip motion in detail because extracting the exact lip shape is difficult. Therefore, the proposed method uses lip-shape features extracted from a divided lip region and compares lip motion using dynamic time warping (DTW). Experimental results demonstrate that the proposed method achieves an authentication rate of more than 99.5%.
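The abstract names dynamic time warping (DTW) as the technique used to compare lip-motion feature sequences. Below is a minimal, generic DTW sketch in Python for illustration only; it is not the authors' implementation, and the feature dimensionality, sequence lengths, and the function name dtw_distance are assumptions made for the example.

import numpy as np

def dtw_distance(seq_a: np.ndarray, seq_b: np.ndarray) -> float:
    """DTW distance between two sequences of feature vectors
    (shape: frames x feature_dim), using Euclidean frame distance."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            # Extend the cheapest of the three allowed warping steps.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

if __name__ == "__main__":
    # Hypothetical example: an enrolled utterance vs. a test utterance,
    # each a sequence of 8-dimensional lip-shape feature vectors.
    rng = np.random.default_rng(0)
    enrolled = rng.standard_normal((40, 8))
    test = rng.standard_normal((35, 8))
    print("DTW distance:", dtw_distance(enrolled, test))

In an authentication setting of the kind the abstract describes, such a distance between the enrolled and test sequences would typically be thresholded to accept or reject the claimed identity; the threshold and decision rule here are assumptions, not details from the paper.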
Keywords :
biometrics (access control); image recognition; message authentication; biometrics authentication method; camera extracting lip area; dynamic time warping; exact lip shape; lip motion; personal authentication method; registration data; time differential coefficients; utterance; vertical lip ranges; Accuracy; Authentication; Face; Feature extraction; Image edge detection; Shape; Training;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2011 8th International Conference on Information, Communications and Signal Processing (ICICS)
Conference_Location :
Singapore
Print_ISBN :
978-1-4577-0029-3
Type :
conf
DOI :
10.1109/ICICS.2011.6173131
Filename :
6173131