SLAM for Robotic Navigation by Fusing RGB-D and Inertial Data in Recurrent and Convolutional Neural Networks

2019 
Simultaneous localization and mapping (SLAM) is a key component of mobile robot navigation that enables many service robotic applications. The capability of acquiring an accurate 3D map of an environment is critical for robots to perform various tasks with a high degree of autonomy. Due to the complexity of indoor environments and sensor uncertainties, SLAM remains a challenging task in the domain of 3D reconstruction. In this paper, we propose a simple yet effective solution for RGB-D based SLAM that integrates an Inertial Measurement Unit (IMU) into a recurrent and convolutional neural network, leading to enhanced pose estimation and point cloud registration. The IMU provides the advantages of high-rate inertial measurements and drift-error reduction. Specifically, by imposing additional constraints from the IMU device, a long short-term memory (LSTM) network is trained to mitigate scale ambiguity and thus improve the concatenated ego-motion estimation. Compared to existing SLAM techniques and recent RNN-based solutions for 3D reconstruction, we show that our approach is competitive, with high accuracy and robustness.
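To make the described fusion concrete, the sketch below illustrates one plausible way to combine a convolutional RGB-D encoder with an LSTM over high-rate IMU samples to regress relative 6-DoF ego-motion. This is a minimal illustration only, assuming PyTorch, a stacked pair of 4-channel RGB-D frames, and a 6-axis IMU; the class names, layer sizes, and output parameterization are hypothetical and are not taken from the paper.

```python
# Minimal visual-inertial odometry sketch (assumptions: PyTorch, two stacked
# 4-channel RGB-D frames, 6-axis IMU, 6-DoF relative-pose output).
import torch
import torch.nn as nn

class RGBDEncoder(nn.Module):
    """Convolutional encoder for a stacked pair of RGB-D frames (8 channels)."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(8, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, out_dim)

    def forward(self, x):                         # x: (B, 8, H, W)
        return self.fc(self.conv(x).flatten(1))   # (B, out_dim)

class VisualInertialOdometry(nn.Module):
    """Fuses visual features with an LSTM over IMU samples to regress the
    6-DoF relative pose between two consecutive RGB-D frames."""
    def __init__(self, imu_dim=6, imu_hidden=128, visual_dim=256):
        super().__init__()
        self.encoder = RGBDEncoder(visual_dim)
        self.imu_lstm = nn.LSTM(imu_dim, imu_hidden, num_layers=2, batch_first=True)
        self.pose_head = nn.Sequential(
            nn.Linear(visual_dim + imu_hidden, 128), nn.ReLU(),
            nn.Linear(128, 6),                    # (tx, ty, tz, roll, pitch, yaw)
        )

    def forward(self, rgbd_pair, imu_seq):
        # rgbd_pair: (B, 8, H, W)  two stacked RGB-D frames
        # imu_seq:   (B, T, 6)     accel + gyro samples between the two frames
        visual = self.encoder(rgbd_pair)
        _, (h_n, _) = self.imu_lstm(imu_seq)      # h_n: (num_layers, B, imu_hidden)
        fused = torch.cat([visual, h_n[-1]], dim=1)
        return self.pose_head(fused)              # (B, 6) relative pose

# Usage with random tensors:
model = VisualInertialOdometry()
pose = model(torch.randn(2, 8, 64, 64), torch.randn(2, 50, 6))
print(pose.shape)  # torch.Size([2, 6])
```

In such a design, the per-frame relative poses could be concatenated (chained) to form a trajectory, with the IMU-derived constraints helping to resolve the scale ambiguity that purely visual ego-motion estimation suffers from; the actual network architecture and training objective used by the authors may differ.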