Title :
Look where you're going [robotic wheelchair]
Author :
Kuno, Yoshinori ; Shimada, Nobutaka ; Shirai, Yoshiaki
Date :
3/1/2003 12:00:00 AM
Abstract :
We propose a robotic wheelchair that observes both the user and the environment. It infers the user's intentions from his/her behavior combined with environmental information. It also observes the user when he/she is off the wheelchair, recognizing commands indicated by hand gestures. Experimental results show our approach to be promising. Although the current system uses face direction, for people who find it difficult to move their faces it can be modified to use movements of the mouth, eyes, or any other body part they can move. Since such movements are generally noisy, integrating observation of the user with observation of the environment will be effective in understanding the user's real intentions and will be a useful technique for better human interfaces.
Keywords :
face recognition; gesture recognition; handicapped aids; medical robotics; mobile robots; motion control; body part movements; environmental information; eye movements; face direction; hand gesture recognition; human interfaces; mouth movements; robotic wheelchair; user intentions; Cameras; Control systems; Humans; Navigation; Robot vision systems; Sensor systems; Target recognition; Wheelchairs
Journal_Title :
IEEE Robotics & Automation Magazine
DOI :
10.1109/MRA.2003.1191708