Abstract:
Facial expression recognition is widely used in many applications. Most existing approaches rely on complex algorithms and therefore demand substantial computing resources. To perform facial expression recognition on resource-limited mobile platforms, we develop a system that is low in complexity, highly efficient, runs in real time, and requires no prior training. In this paper, lip features are used to classify human emotion. First, we detect human faces using Haar-like features. Second, the mouth region is located by horizontal projection within the detected face. Third, we determine the lip corners by using vertical projection to find the lip boundary. The extracted features are the distance along the mouth contour and the difference in gray values between the upper lip and the midpoint of the mouth's height. Finally, we adopt a feature-based analysis approach. We attempt to recognize four expressions: neutral, smile, surprise, and sadness. The whole system runs in real time at about twenty frames per second. The experimental results show an average recognition rate of about 85%, demonstrating its efficacy in real-world environments.
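The projection steps described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it uses NumPy on a synthetic grayscale face patch (all sizes, gray values, and the simple threshold are assumptions chosen for clarity), locating the mouth band by horizontal projection and the lip corners by vertical projection, since lips are darker than the surrounding skin.

```python
import numpy as np

# Synthetic grayscale "lower face" patch: bright skin (value 200)
# with a darker horizontal mouth strip (value 50). All sizes and
# gray values here are illustrative assumptions, not from the paper.
face = np.full((60, 100), 200, dtype=np.uint8)
face[30:38, 25:75] = 50  # mouth: rows 30-37, columns 25-74

# Step 1: horizontal projection (sum of gray values in each row).
# The mouth band appears as a dip because lips are darker than skin.
h_proj = face.sum(axis=1)
mouth_row = int(np.argmin(h_proj))  # darkest row of the patch

# Step 2: vertical projection restricted to a band around that row.
band = face[mouth_row - 4: mouth_row + 4, :]
v_proj = band.sum(axis=0)

# Lip corners: leftmost/rightmost columns whose projection falls
# below a simple midpoint threshold between skin and lip levels.
thresh = (v_proj.max() + v_proj.min()) / 2
dark_cols = np.where(v_proj < thresh)[0]
left_corner, right_corner = int(dark_cols[0]), int(dark_cols[-1])

mouth_width = right_corner - left_corner + 1
print(mouth_row, left_corner, right_corner, mouth_width)
```

On this synthetic patch the projections recover the planted mouth strip exactly; on real images the same idea would be applied inside a face region found by a Haar-like detector, with a more robust threshold.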