A Fast Feature Tracking Algorithm for Visual Odometry and Mapping Based on RGB-D Sensors

2014 
The recent introduction of low-cost sensors such as the Kinect allows the design of real-time applications (e.g., in robotics) that exploit novel capabilities. One such application is Visual Odometry, a fundamental module of any robotic platform, which uses the synchronized color/depth streams captured by these devices to build a map representation of the environment while simultaneously localizing the robot within that map. Aiming to minimize the error accumulation inherent to the process of robot localization, we design a visual feature tracker that works as the front-end of a Visual Odometry system for RGB-D sensors. Feature points are added to the tracker selectively, based on pre-specified criteria such as the number of currently active points and their spatial distribution throughout the image. Our proposal is a tracking strategy that allows real-time camera pose computation (24.847 ms per frame on average) even though no specialized hardware (such as modern GPUs) is employed. Experiments carried out on publicly available benchmark datasets demonstrate the usefulness of the method, which achieved RMSE rates superior to those of the state-of-the-art RGB-D SLAM algorithm.
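The selective feature-addition criterion described above (cap the number of active points and enforce spatial distribution across the image) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name, the grid-based distribution test, and the thresholds (`max_active`, `grid`) are all assumptions.

```python
def select_new_features(active, candidates, width, height,
                        max_active=300, grid=8):
    """Pick candidate (x, y) keypoints to start tracking.

    Accepts a candidate only while the total number of active tracks
    stays below `max_active`, and only if its grid cell does not already
    contain a tracked point -- a simple way to keep features spread out.
    Values are illustrative; the paper's actual criteria may differ.
    """
    cell_w, cell_h = width / grid, height / grid
    # Cells already occupied by currently tracked points.
    occupied = {(int(x // cell_w), int(y // cell_h)) for x, y in active}
    selected = []
    for x, y in candidates:
        if len(active) + len(selected) >= max_active:
            break  # active-point budget reached
        cell = (int(x // cell_w), int(y // cell_h))
        if cell not in occupied:  # enforce spatial distribution
            occupied.add(cell)
            selected.append((x, y))
    return selected
```

For example, with one active point at (10, 10) in a 640x480 image, a candidate at (12, 12) falls into the same grid cell and is rejected, while a candidate at (200, 200) lands in a free cell and is accepted.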