  • DocumentCode
    3709101
  • Title
    Robust visual inertial odometry using a direct EKF-based approach
  • Author
    Michael Bloesch; Sammy Omari; Marco Hutter; Roland Siegwart
  • Author_Institution
    Autonomous Systems Lab, ETH Zürich
  • fYear
    2015
  • Firstpage
    298
  • Lastpage
    304
  • Abstract
    In this paper, we present a monocular visual-inertial odometry algorithm which, by directly using pixel intensity errors of image patches, achieves accurate tracking performance while exhibiting a very high level of robustness. After detection, the tracking of the multilevel patch features is closely coupled to the underlying extended Kalman filter (EKF) by directly using the intensity errors as the innovation term during the update step. We follow a purely robocentric approach where the locations of 3D landmarks are always estimated with respect to the current camera pose. Furthermore, we decompose landmark positions into a bearing vector and a distance parametrization whereby we employ a minimal representation of differences on a corresponding σ-Algebra in order to achieve better consistency and to improve the computational performance. Due to the robocentric, inverse-distance landmark parametrization, the framework does not require any initialization procedure, leading to a truly power-up-and-go state estimation system. The presented approach is successfully evaluated in a set of highly dynamic hand-held experiments as well as directly employed in the control loop of a multirotor unmanned aerial vehicle (UAV).
    (An illustrative code sketch of the landmark parametrization and the intensity-error update follows this record.)
  • Keywords
    "Feature extraction","Cameras","Three-dimensional displays","Robots","Estimation","Technological innovation","Uncertainty"
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
  • Type
    conf
  • DOI
    10.1109/IROS.2015.7353389
  • Filename
    7353389
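
The abstract above rests on two concrete mechanisms: each landmark is stored robocentrically as a unit bearing vector plus an inverse distance in the current camera frame, and the raw patch intensity errors enter the EKF update directly as the innovation. Below is a minimal Python/NumPy sketch of those two mechanisms only; the function names, pinhole intrinsics, patch size, and the stand-in Jacobian H are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def landmark_to_point(bearing, inv_dist):
        """Robocentric landmark: a unit bearing vector and an inverse distance,
        both expressed in the current camera frame; the 3D point is mu / rho."""
        mu = bearing / np.linalg.norm(bearing)
        return mu / inv_dist

    def project(point, fx, fy, cx, cy):
        """Pinhole projection of a camera-frame point to pixel coordinates."""
        x, y, z = point
        return np.array([fx * x / z + cx, fy * y / z + cy])

    def ekf_update(x, P, innovation, H, R):
        """Standard EKF update. In the paper's scheme the innovation is the
        stacked pixel-intensity error of a multilevel image patch and H is its
        Jacobian with respect to the filter state (in practice obtained by
        chaining image gradients through the warp, projection, and landmark
        models); here both are supplied by the caller."""
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x_new = x + K @ innovation
        P_new = (np.eye(P.shape[0]) - K @ H) @ P
        return x_new, P_new

    # Toy usage: one landmark, a 2x2 patch -> a 4-dimensional intensity innovation.
    bearing = np.array([0.1, -0.05, 1.0])
    rho = 0.25                                   # inverse distance, i.e. 4 m away
    uv = project(landmark_to_point(bearing, rho),
                 fx=450.0, fy=450.0, cx=320.0, cy=240.0)

    x = np.zeros(6)                              # placeholder filter state
    P = 0.1 * np.eye(6)
    intensity_error = np.array([3.0, -1.5, 0.5, 2.0])  # I_ref - I_cur over the patch
    H = 0.01 * np.ones((4, 6))                   # stand-in photometric Jacobian
    R = 4.0 * np.eye(4)                          # intensity measurement noise
    x, P = ekf_update(x, P, intensity_error, H, R)

The inverse-distance term is what lets a newly detected feature start with essentially unbounded depth uncertainty, which is why the abstract can describe the system as power-up-and-go with no separate initialization procedure.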