DocumentCode
3709596
Title
PROBE: Predictive robust estimation for visual-inertial navigation
Author
Valentin Peretroukhin;Lee Clement;Matthew Giamou;Jonathan Kelly
Author_Institution
Institute for Aerospace Studies, University of Toronto, Canada
fYear
2015
Firstpage
3668
Lastpage
3675
Abstract
Navigation in unknown, chaotic environments continues to present a significant challenge for the robotics community. Lighting changes, self-similar textures, motion blur, and moving objects are all considerable stumbling blocks for state-of-the-art vision-based navigation algorithms. In this paper we present a novel technique for improving localization accuracy within a visual-inertial navigation system (VINS). We make use of training data to learn a model for the quality of visual features with respect to localization error in a given environment. This model maps each visual observation from a predefined prediction space of visual-inertial predictors onto a scalar weight, which is then used to scale the observation covariance matrix. In this way, our model can adjust the influence of each observation according to its quality. We discuss our choice of predictors and report substantial reductions in localization error on 4 km of data from the KITTI dataset, as well as on experimental datasets consisting of 700 m of indoor and outdoor driving on a small ground rover equipped with a Skybotix VI-Sensor.
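The abstract describes mapping each visual observation, via learned visual-inertial predictors, to a scalar weight that scales its observation covariance. The sketch below illustrates that general idea only; it is not the authors' implementation, and the predictor names, the choice of regressor, and the clamping rule are all illustrative assumptions.

```python
# Minimal sketch of per-observation covariance re-weighting, assuming a generic
# learned regressor: predictors for a feature observation -> scalar weight beta,
# which scales the nominal observation covariance R before the estimator update.
import numpy as np
from sklearn.ensemble import RandomForestRegressor  # assumed stand-in model


def extract_predictors(blur_metric, ang_vel_mag, dist_from_centre):
    """Hypothetical predictors for one feature observation (illustrative only)."""
    return np.array([blur_metric, ang_vel_mag, dist_from_centre])


def train_weight_model(predictor_rows, error_scores):
    """Fit a regressor offline on predictors paired with per-observation error
    scores derived from training data with ground truth (stand-in model)."""
    model = RandomForestRegressor(n_estimators=100)
    model.fit(np.asarray(predictor_rows), np.asarray(error_scores))
    return model


def weighted_observation_covariance(model, predictors, R_nominal):
    """Scale the nominal covariance by the predicted weight so that
    low-quality observations have less influence on the state update."""
    beta = float(model.predict(predictors.reshape(1, -1))[0])
    beta = max(beta, 1.0)  # never trust an observation more than its nominal noise
    return beta * R_nominal
```

Under these assumptions, a downweighted observation (large beta) contributes a smaller information update, which is the effect the abstract attributes to the learned model.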
Keywords
"Visualization","Navigation","Probes","Cameras","Robot sensing systems","Robustness","Uncertainty"
Publisher
IEEE
Conference_Titel
2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Type
conf
DOI
10.1109/IROS.2015.7353890
Filename
7353890
Link To Document