Title :
Head gestures for computer control
Author_Institution :
IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
Abstract :
This paper explores the ways in which head gestures can be applied to the user interface. Four categories of gestural task are considered: pointing, continuous control, spatial selection, and symbolic selection. For each category, the problem is first examined in the abstract, focusing on human factors and an analysis of the task; solutions are then presented that take sensing constraints and computational efficiency into consideration. A hybrid pointer-control algorithm is described that is better suited to facial pointing than either a pure rate-control or a pure position-control approach. Variations of the algorithm are described for scrolling and selection tasks. The primary contribution is to address a full range of interactive head gestures using a consistent approach that focuses as much on user and task constraints as on sensing considerations.
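The abstract contrasts pure rate control (head offset drives pointer velocity) with pure position control (head offset maps directly to pointer position) and proposes a hybrid. The record does not give the paper's actual algorithm, so the sketch below is only an illustrative assumption: it uses position control inside a small neutral zone around the resting head pose for fine adjustment, and switches to rate control for larger offsets to cover distance. All names, gains, and thresholds (HybridPointerController, NEUTRAL_ZONE_DEG, POSITION_GAIN, RATE_GAIN) are hypothetical.

```python
# Hypothetical hybrid pointer controller. The paper's exact algorithm is not
# described in this record; the blend below (position control near the neutral
# head pose, rate control beyond a threshold) is an illustrative assumption.

NEUTRAL_ZONE_DEG = 5.0   # below this yaw/pitch offset, map head pose to position
POSITION_GAIN = 40.0     # pixels per degree inside the neutral zone
RATE_GAIN = 120.0        # pixels per second per degree outside the zone


class HybridPointerController:
    def __init__(self, screen_w, screen_h):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.x = screen_w / 2.0
        self.y = screen_h / 2.0
        # Anchor around which fine (position-controlled) movement happens.
        self.anchor_x = self.x
        self.anchor_y = self.y

    def update(self, yaw_deg, pitch_deg, dt):
        """Advance the pointer given head yaw/pitch offsets (degrees) and elapsed time (s)."""
        if abs(yaw_deg) < NEUTRAL_ZONE_DEG and abs(pitch_deg) < NEUTRAL_ZONE_DEG:
            # Fine positioning: pointer offset is proportional to head offset.
            self.x = self.anchor_x + POSITION_GAIN * yaw_deg
            self.y = self.anchor_y + POSITION_GAIN * pitch_deg
        else:
            # Coarse travel: pointer velocity is proportional to head offset.
            self.x += RATE_GAIN * yaw_deg * dt
            self.y += RATE_GAIN * pitch_deg * dt
            # Re-anchor so returning toward the neutral pose resumes fine control here.
            self.anchor_x, self.anchor_y = self.x, self.y
        # Clamp to the screen.
        self.x = min(max(self.x, 0.0), self.screen_w - 1)
        self.y = min(max(self.y, 0.0), self.screen_h - 1)
        return self.x, self.y


# Example: drive the pointer from two simulated head-pose samples.
ctrl = HybridPointerController(1920, 1080)
print(ctrl.update(yaw_deg=2.0, pitch_deg=-1.0, dt=0.033))   # fine positioning
print(ctrl.update(yaw_deg=12.0, pitch_deg=0.0, dt=0.033))   # coarse, rate-controlled travel
```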
Keywords :
computer vision; human factors; interactive systems; optical tracking; position control; target tracking; user interfaces; facial pointing; gesture estimation; head gestures; interactive system; pointer control; user interface; Cameras; Computational efficiency; Computer interfaces; Feedback; Head; Human computer interaction; Human factors; Pervasive computing; Position control; User interfaces
Conference_Title :
Proceedings of the IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, 2001
Conference_Location :
Vancouver, BC, Canada
Print_ISBN :
0-7695-1074-4
DOI :
10.1109/RATFG.2001.938911