Movement Direction Estimation Using Omnidirectional Images in a SLAM Algorithm

2017 
This work presents a method to estimate the movement direction of a mobile robot using only visual information, with no additional sensors. This information is provided by a catadioptric system mounted on the robot, formed by a camera pointing towards a convex mirror. The system provides the robot with omnidirectional images covering a 360\(^\circ \) field of view around the camera-mirror axis. To test the movement-direction estimator, a SLAM algorithm is presented that uses two different global appearance descriptors: one to calculate the orientation of the robot and one to estimate the distance between two positions. The movement direction itself is calculated through landmark extraction, using SURF features. A set of omnidirectional images has been used to test the effectiveness of the method.
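The two geometric ideas behind the abstract can be sketched in a few lines. In a panoramic (unwrapped omnidirectional) image, a rotation of the robot about the camera-mirror axis appears as a circular column shift, so the relative orientation between two images can be recovered by circularly cross-correlating a global appearance profile. Likewise, for a small planar translation with heading \(\phi\), the bearing change of a landmark at bearing \(\theta\) follows approximately \(\Delta\theta \approx A\sin(\theta-\phi)\), so the heading can be fitted from matched landmark bearings. The column-averaged profile, the FFT correlation, and the small-baseline sinusoidal model below are illustrative assumptions, not the paper's actual descriptors; the SURF detection and matching step is assumed already done.

```python
import numpy as np

def panoramic_rotation(img_a, img_b):
    """Estimate the relative orientation (degrees) between two panoramas.

    Assumption: a rotation about the camera-mirror axis is a circular
    column shift, recovered by FFT-based circular cross-correlation of
    column-averaged intensity profiles (a simple global appearance cue).
    """
    pa = img_a.mean(axis=0)  # 1-D global appearance profile of image A
    pb = img_b.mean(axis=0)  # 1-D global appearance profile of image B
    corr = np.fft.ifft(np.fft.fft(pb) * np.conj(np.fft.fft(pa))).real
    shift = int(np.argmax(corr))        # column shift aligning B onto A
    return 360.0 * shift / img_a.shape[1]

def movement_direction(theta, dtheta):
    """Estimate the heading (radians) from matched landmark bearings.

    Small-baseline model: dtheta ~ A*sin(theta - phi)
                                 = A*cos(phi)*sin(theta) - A*sin(phi)*cos(theta),
    a linear least-squares fit in sin(theta), cos(theta). The phi vs
    phi+pi ambiguity (sign of A) is ignored in this sketch.
    """
    M = np.column_stack([np.sin(theta), np.cos(theta)])
    c1, c2 = np.linalg.lstsq(M, dtheta, rcond=None)[0]
    return np.arctan2(-c2, c1)

# usage with synthetic data: a 64x360 panorama rotated by 30 columns
# (1 column = 1 degree), and bearings displaced for a heading of 0.5 rad
rng = np.random.default_rng(0)
img = rng.random((64, 360))
rotated = np.roll(img, 30, axis=1)
print(panoramic_rotation(img, rotated))        # → 30.0

theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
dtheta = 0.1 * np.sin(theta - 0.5)
print(round(movement_direction(theta, dtheta), 3))  # → 0.5
```

The FFT correlation makes the orientation search O(W log W) over the panorama width W instead of testing every shift, which is the usual reason global appearance methods adopt it.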