Stereo Visual Odometry and Semantics based Localization of Aerial Robots in Indoor Environments

2018 
In this paper, we propose a particle filter localization approach for mini aerial robots, based on stereo visual odometry (VO) and semantic information from indoor environments. The prediction stage of the particle filter uses the 3D pose of the aerial robot estimated by the stereo VO algorithm. This predicted 3D pose is then updated using inertial as well as semantic measurements. The algorithm processes semantic measurements in two phases: first, a pre-trained deep learning (DL) based object detector performs real-time object detection in the RGB spectrum; second, from the corresponding 3D point clouds of the detected objects, we segment their dominant horizontal plane and estimate their relative position, augmenting a prior map with new detections. The augmented map is then used to obtain a drift-free pose estimate of the aerial robot. We validate our approach in several real flight experiments, comparing it against ground truth and a state-of-the-art visual SLAM approach.
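To make the described pipeline concrete, the sketch below outlines one particle filter step in the structure the abstract describes: prediction from a stereo-VO pose increment, followed by an update that weights particles against semantically detected landmarks in a prior map. This is a minimal illustration only; all names, dimensions, and noise parameters (e.g. `NUM_PARTICLES`, `vo_delta`, `semantic_map`) are assumptions, not the authors' implementation, and the observation model ignores orientation for brevity.

```python
import numpy as np

# Minimal particle-filter sketch: VO-based prediction + semantic-landmark update.
# All names and parameters are illustrative assumptions.

NUM_PARTICLES = 500
STATE_DIM = 4  # assumed state: x, y, z, yaw

rng = np.random.default_rng(0)
particles = np.zeros((NUM_PARTICLES, STATE_DIM))
weights = np.full(NUM_PARTICLES, 1.0 / NUM_PARTICLES)


def predict(particles, vo_delta, noise_std=(0.02, 0.02, 0.02, 0.01)):
    """Prediction: propagate each particle by the stereo-VO pose increment,
    perturbed with Gaussian noise to model VO drift."""
    noise = rng.normal(0.0, noise_std, size=particles.shape)
    return particles + vo_delta + noise


def update(particles, weights, detections, semantic_map, meas_std=0.15):
    """Update: weight particles by agreement between the measured relative
    positions of detected objects and landmarks in the (augmented) map.
    Orientation is ignored here for simplicity."""
    for obj_class, rel_pos in detections:
        map_pos = semantic_map.get(obj_class)
        if map_pos is None:
            continue  # unseen object: the full method would augment the map instead
        expected = map_pos - particles[:, :3]          # expected relative position
        err = np.linalg.norm(expected - rel_pos, axis=1)
        weights = weights * (np.exp(-0.5 * (err / meas_std) ** 2) + 1e-300)
    return weights / weights.sum()


def resample(particles, weights):
    """Simple multinomial resampling to avoid weight degeneracy."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))


# One illustrative filter step with made-up numbers.
semantic_map = {"chair": np.array([2.0, 1.0, 0.4])}    # known landmark position
vo_delta = np.array([0.10, 0.00, 0.02, 0.01])           # VO pose increment
detections = [("chair", np.array([1.9, 1.0, 0.38]))]    # detector + plane-fit output

particles = predict(particles, vo_delta)
weights = update(particles, weights, detections, semantic_map)
particles, weights = resample(particles, weights)
pose_estimate = np.average(particles, axis=0, weights=weights)
```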