Tightly coupled fusion of direct stereo visual odometry and inertial sensor measurements using an iterated information filter

2017 
In this paper we describe a recursive filter for the fusion of inertial and visual measurements for self-positioning, where the sensors are rigidly attached to the moving person or object. The system is self-contained, requires no infrastructure, and is suitable for indoor as well as seamless indoor/outdoor localization. The proposed approach fuses the images acquired by a stereo camera with measurements from a 3-axis MEMS accelerometer and gyroscope. We focus on a visual odometry approach in which motion information is computed from subsequent (stereo) images. The algorithm uses image gradients to determine the Jacobian matrices in an iterated information filter formulation. Thus, after the strapdown inertial navigation system (INS) prediction step, the correction based on semi-dense direct image alignment properly accounts for the state uncertainty, and in general only one or two iterations are required for convergence. The accuracy and robustness of the combined localization system are evaluated on the EuRoC MAV dataset as well as on a walking scenario recorded with a custom sensor setup.
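The prediction/correction structure described above can be sketched as a generic iterated measurement update in information form. This is an illustrative reconstruction, not the authors' implementation: the state `x0` and covariance `P0` stand for the output of the INS prediction step, while `h` and its Jacobian `H_jac` (which in the paper would come from image gradients of the semi-dense direct alignment) are left as hypothetical placeholder callables.

```python
import numpy as np

def iterated_information_update(x0, P0, z, h, H_jac, R, max_iters=2, tol=1e-6):
    """One iterated measurement update in information (inverse-covariance) form.

    x0, P0    : predicted state and covariance (e.g. from an INS prediction step)
    z         : measurement vector
    h, H_jac  : measurement function and its Jacobian (placeholders here;
                in a direct method the Jacobian is built from image gradients)
    R         : measurement noise covariance
    """
    Y0 = np.linalg.inv(P0)          # prior information matrix
    Rinv = np.linalg.inv(R)
    x = np.array(x0, dtype=float)
    for _ in range(max_iters):
        H = H_jac(x)
        r = z - h(x)                # innovation at the current linearization point
        Y = Y0 + H.T @ Rinv @ H     # posterior information matrix
        # Gauss-Newton step that keeps the prior term (x - x0),
        # as in an iterated extended Kalman/information filter:
        dx = np.linalg.solve(Y, H.T @ Rinv @ r - Y0 @ (x - x0))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x, np.linalg.inv(Y)      # posterior mean and covariance
```

For a linear measurement the loop converges after the second iteration, matching the paper's observation that only one or two iterations are typically needed when the linearization point is close to the optimum.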