Title :
Mobile robot vision navigation & localization using Gist and Saliency
Author :
Chang, Chin-Kai ; Siagian, Christian ; Itti, Laurent
Author_Institution :
Dept. of Comput. Sci., Univ. of Southern California, Los Angeles, CA, USA
Abstract :
We present a vision-based navigation and localization system built on two biologically-inspired scene understanding models derived from studies of human visual capabilities: (1) the Gist model, which captures the holistic characteristics and layout of an image, and (2) the Saliency model, which emulates the visual attention of primates to identify conspicuous regions in an image. The localization system uses the gist features and salient regions to accurately localize the robot, while the navigation system uses the salient regions to perform visual feedback control, directing the robot's heading toward a user-provided goal location. We tested the system on our robot, Beobot2.0, in indoor and outdoor environments with route lengths of 36.67m (10,890 video frames) and 138.27m (28,971 frames), respectively. On average, the robot drives within 3.68cm (indoor) and 8.78cm (outdoor) of the center of the lane.
Keywords :
feedback; mobile robots; path planning; robot vision; Beobot2.0; Gist model; Saliency model; biologically-inspired scene understanding models; conspicuous regions; human visual capabilities; localization system; mobile robot vision navigation; visual feedback control
Conference_Titel :
2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Conference_Location :
Taipei, Taiwan
Print_ISBN :
978-1-4244-6674-0
DOI :
10.1109/IROS.2010.5649136