Complementary Perception for Handheld SLAM

2018 
We present a novel method for mapping general three-dimensional environments, where sufficient geometric or visual information is not everywhere guaranteed and where the device motion is unconstrained, as with handheld systems. The continuous-time simultaneous localization and mapping algorithm integrates a lidar, a camera, and an inertial measurement unit in a complementary fashion, whereby all sensors contribute constraints to the optimization. The proposed algorithm is designed to expand the domain of mappable environments and therefore increase the reliability and utility of general-purpose mobile mapping. A key component of the proposed algorithm is the incorporation of depth uncertainty into visual features, which is effective for noisy surfaces and allows features with and without depth estimates to be modeled in a unified manner. Results demonstrate a wider mappable domain in challenging environments compared to state-of-the-art lidar- or vision-based localization and mapping algorithms.
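The unified treatment of features with and without depth can be illustrated with a small sketch. The class names, parameterization (inverse depth with a variance), and update rule below are illustrative assumptions, not the paper's actual implementation: a feature with no depth measurement simply carries a very diffuse prior, and a noisy depth observation (e.g., from a lidar return) tightens it via a scalar Kalman update.

```python
class VisualFeature:
    """Hypothetical bearing-only visual feature with uncertain inverse depth.

    A feature without any depth estimate is the same object with a very
    diffuse inverse-depth prior, so both cases are handled uniformly.
    """

    def __init__(self, bearing, inv_depth=0.1, inv_depth_var=1e6):
        self.bearing = bearing              # unit direction in the camera frame
        self.inv_depth = inv_depth          # rho = 1/d (prior mean)
        self.inv_depth_var = inv_depth_var  # diffuse => effectively depth-less

    def fuse_depth(self, measured_depth, meas_var):
        """Scalar Kalman update of inverse depth from a noisy depth reading."""
        z = 1.0 / measured_depth
        # First-order propagation of the measurement variance through rho = 1/d.
        r = meas_var / measured_depth**4
        k = self.inv_depth_var / (self.inv_depth_var + r)  # Kalman gain
        self.inv_depth += k * (z - self.inv_depth)
        self.inv_depth_var *= (1.0 - k)

    def has_depth(self, var_threshold=1.0):
        """A feature 'has depth' once its uncertainty is sufficiently small."""
        return self.inv_depth_var < var_threshold


# Usage: a feature starts depth-less; one noisy lidar depth makes it metric.
f = VisualFeature(bearing=(0.0, 0.0, 1.0))
print(f.has_depth())                       # False: only a diffuse prior
f.fuse_depth(measured_depth=5.0, meas_var=0.01)
print(f.has_depth())                       # True: variance collapsed
```

The point of the sketch is the design choice it mirrors: because depth enters as a mean plus variance rather than a hard value, features on noisy surfaces contribute weak depth constraints while purely visual features still constrain bearing, and the optimizer can consume both kinds through one model.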