Abstract:
Gesture-based interfaces, which let users control devices with hand or finger motions, are becoming increasingly popular. These interfaces use gesture-recognition algorithms to identify body movements. The systems then determine which device command a particular gesture represents and take the appropriate action. For example, moving a hand sideways might indicate that a user wants to turn a page on an e-reader screen. Proponents say gesture recognition, which draws on computer vision, image processing, and other techniques, is useful largely because it lets people communicate with a machine in a more natural manner, without a mouse or other intermediate device. Although the technology has long been discussed as a potentially useful, rich interface and several gesture-control products have been released over the years, it has never achieved mainstream status.
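To illustrate the mapping step the abstract describes (a recognized gesture is looked up and translated into a device command), the following minimal Python sketch shows one way such a dispatch might look; the gesture names and commands are hypothetical and not taken from any product mentioned here.

    # Minimal sketch: map a recognized gesture to a device command.
    # Gesture labels and commands below are hypothetical examples.
    GESTURE_TO_COMMAND = {
        "swipe_left": "next_page",       # e.g., hand moving sideways turns an e-reader page
        "swipe_right": "previous_page",
        "palm_forward": "pause",
    }

    def handle_gesture(gesture):
        """Return the device command a recognized gesture represents, or None if unknown."""
        return GESTURE_TO_COMMAND.get(gesture)

    print(handle_gesture("swipe_left"))  # -> next_page

In a real system the keys of such a table would come from the output of the gesture-recognition algorithm (computer vision, image processing, or radio-signal sensing), and the values would be calls into the device's control interface.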
Keywords:
computer vision; gesture recognition; human-computer interaction; body movement identification; control devices; finger motions; gestural technology; gesture-based interfaces; gesture-control products; gesture-recognition algorithms; hand motions; image processing; control systems; Eyesight Technologies; GestIC; Leap Motion; Leap Motion Controller; Microchip Technology; Microsoft Kinect; Touch Free; University of Washington; WiSee; gestural interfaces; gesture control; gesture technology; machine vision