  • DocumentCode
    106370
  • Title
    Accurate Human Navigation Using Wearable Monocular Visual and Inertial Sensors
  • Author
    Ya Tian; Hamel, W.R.; Jindong Tan
  • Author_Institution
    Sch. of Inf. & Electr. Eng., Shandong Jianzhu Univ., Jinan, China
  • Volume
    63
  • Issue
    1
  • fYear
    2014
  • fDate
    Jan. 2014
  • Firstpage
    203
  • Lastpage
    213
  • Abstract
    This paper presents a novel visual-inertial integration system for human navigation in free-living environments, in which measurements from wearable inertial and monocular visual sensors are fused. The orientation, pre-estimated from magnetic, angular-rate, and gravity (MARG) sensors, is used to estimate translation from the visual and inertial data. This significantly improves the performance of the fusion strategy and simplifies the fusion procedure, because the gravitational acceleration can be correctly removed from the accelerometer measurements before fusion, for which a linear Kalman filter is selected as the estimator. Furthermore, the pre-estimated orientation helps to eliminate erroneous point matches based on the properties of pure camera translation, so the computational requirements are significantly reduced compared with the RANdom SAmple Consensus (RANSAC) algorithm. In addition, a single camera with an adaptive frame rate is used both to avoid motion blur, based on the compensated angular velocity and acceleration, and to provide a visual zero-velocity update during static motion; this recovers a more accurate baseline while further reducing the computational requirements. In particular, the absolute scale factor, which is usually lost in monocular camera tracking, can be recovered by introducing it into the estimator. Simulation and experimental results are presented for different environments and types of movement, and results from a Pioneer robot are used to demonstrate the accuracy of the proposed method.
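    The core idea summarized above (using a pre-estimated body-to-world orientation to remove gravity from the accelerometer so that a linear Kalman filter can fuse inertial predictions with visual translation measurements) can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a unit quaternion q_bw supplied by the MARG orientation filter, a constant-velocity state [position, velocity], and a visual translation measurement already in metric scale; all names (quat_to_rot, LinearKF, remove_gravity, GRAVITY, q_acc, r_vis) are illustrative.

    # Minimal sketch (assumptions noted above), not the paper's actual code.
    import numpy as np

    GRAVITY = np.array([0.0, 0.0, 9.81])  # world-frame gravity (m/s^2), z up

    def quat_to_rot(q):
        """Rotation matrix for a unit quaternion q = [w, x, y, z] (body -> world)."""
        w, x, y, z = q
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])

    def remove_gravity(acc_body, q_bw):
        """Rotate the raw accelerometer reading into the world frame using the
        pre-estimated orientation and subtract gravity, leaving linear acceleration."""
        return quat_to_rot(q_bw) @ acc_body - GRAVITY

    class LinearKF:
        """Constant-velocity linear Kalman filter; state x = [p (3), v (3)]."""
        def __init__(self, q_acc=0.05, r_vis=0.01):
            self.x = np.zeros(6)
            self.P = np.eye(6)
            self.q_acc = q_acc            # accelerometer noise driving process noise
            self.R = r_vis * np.eye(3)    # visual translation measurement noise

        def predict(self, acc_world, dt):
            """Propagate with gravity-free world-frame acceleration as control input."""
            F = np.eye(6)
            F[:3, 3:] = dt * np.eye(3)
            B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
            self.x = F @ self.x + B @ acc_world
            self.P = F @ self.P @ F.T + self.q_acc * (B @ B.T)

        def update_position(self, p_meas):
            """Correct the state with a metric translation measurement from vision."""
            H = np.hstack([np.eye(3), np.zeros((3, 3))])
            S = H @ self.P @ H.T + self.R
            K = self.P @ H.T @ np.linalg.solve(S, np.eye(3))
            self.x = self.x + K @ (p_meas - H @ self.x)
            self.P = (np.eye(6) - K @ H) @ self.P

    if __name__ == "__main__":
        kf = LinearKF()
        dt = 0.01
        q_bw = np.array([1.0, 0.0, 0.0, 0.0])          # identity orientation (example)
        acc_body = np.array([0.2, 0.0, 9.81])           # raw accelerometer sample
        kf.predict(remove_gravity(acc_body, q_bw), dt)  # inertial prediction
        kf.update_position(np.array([0.001, 0.0, 0.0])) # visual translation correction
        print("state [p, v]:", kf.x)

    Because gravity is subtracted before fusion, both the process and measurement models remain linear, which is the property that lets a linear Kalman filter serve as the fusion estimator in the abstract's description.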
  • Keywords
    acceleration measurement; accelerometers; adaptive Kalman filters; cameras; image fusion; image sensors; inertial navigation; motion estimation; object tracking; Pioneer robot; absolute scale factor; accelerometer measurement; adaptive frame rate single camera; angular rate; camera translation estimation; free living environment; gravity sensor; human navigation; linear Kalman filter; magnet; monocular camera tracking; preestimated orientation; sensor fusion estimation; static motion; visual inertial integration system; visual zero velocity update; wearable inertial sensor; wearable monocular visual sensor; Acceleration; Cameras; Estimation; Kalman filters; Navigation; Sensors; Visualization; Absolute scale; MARG sensors; indoor positioning; monocular camera; visual-inertial integration
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Instrumentation and Measurement
  • Publisher
    IEEE
  • ISSN
    0018-9456
  • Type
    jour
  • DOI
    10.1109/TIM.2013.2277514
  • Filename
    6588331