Title :
Segmentation and recognition of continuous human activity
Author :
Ali, Anjum ; Aggarwal, J.K.
Author_Institution :
Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
Abstract :
This paper presents a methodology for the automatic segmentation and recognition of continuous human activity. We segment a continuous human activity into separate actions and correctly identify each action. The camera views the subject from the lateral view; there are no distinct breaks or pauses between the execution of different actions, and we have no prior knowledge of the commencement or termination of each action. We compute the angles subtended with the vertical axis by three major components of the body: the torso, the upper component of the leg, and the lower component of the leg. Using these three angles as a feature vector, we classify frames as breakpoint or non-breakpoint frames; breakpoints indicate an action's commencement or termination. We use single-action sequences as the training data set. The test sequences, on the other hand, are continuous sequences of human activity that consist of three or more actions in succession. The system has been tested on continuous activity sequences containing actions such as walking, sitting down, standing up, bending, getting up, squatting, and rising. It detects the breakpoints and classifies the actions between them.
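For illustration only, the Python sketch below shows how such a three-angle feature vector might be computed from 2D joint positions and how a frame might be labelled breakpoint or non-breakpoint. The joint names, the nearest-neighbour rule, and the example coordinates are assumptions made for the sketch, not the authors' actual implementation.

import numpy as np

def segment_angle(p_top, p_bottom):
    # Angle (degrees) subtended with the vertical axis by the body
    # segment from p_top to p_bottom; points are (x, y) image coords.
    dx = p_bottom[0] - p_top[0]
    dy = p_bottom[1] - p_top[1]
    return float(np.degrees(np.arctan2(abs(dx), abs(dy))))

def frame_features(joints):
    # joints: dict with hypothetical keys 'neck', 'hip', 'knee', 'ankle'.
    # Returns the 3-angle feature vector: torso, upper leg, lower leg.
    return np.array([
        segment_angle(joints["neck"], joints["hip"]),    # torso
        segment_angle(joints["hip"], joints["knee"]),    # upper leg
        segment_angle(joints["knee"], joints["ankle"]),  # lower leg
    ])

def classify_frame(feat, train_feats, train_labels):
    # Toy 1-nearest-neighbour stand-in for the paper's frame classifier.
    # train_feats: (N, 3) feature vectors drawn from single-action
    # training sequences; train_labels: 'breakpoint' or 'non-breakpoint'.
    i = int(np.argmin(np.linalg.norm(train_feats - feat, axis=1)))
    return train_labels[i]

# Example: an upright posture yields angles near zero.
joints = {"neck": (100, 50), "hip": (100, 120),
          "knee": (102, 180), "ankle": (101, 240)}
print(frame_features(joints))  # approx [0.0, 1.9, 1.0]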
Keywords :
feature extraction; image classification; image recognition; image segmentation; image sequences; automatic recognition; automatic segmentation; breakpoint frames; camera; continuous human activity recognition; continuous human activity segmentation; feature vector; frame classification; nonbreakpoint frames; single action sequences; test sequences; torso; training data set; Cameras; Computer vision; Humans; Leg; Legged locomotion; System testing; Torso; Training data;
Conference_Title :
Proceedings of the IEEE Workshop on Detection and Recognition of Events in Video, 2001
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7695-1293-3
DOI :
10.1109/EVENT.2001.938863