CosySLAM: tracking contact features using visual-inertial object-level SLAM for locomotion

2021 
A legged robot is equipped with several sensors observing different classes of information, in order to provide various estimates of its state and its environment. While state estimation and mapping in this domain have traditionally been handled by multiple local filters, recent progress has been made toward tightly-coupled estimation, where multiple observations are merged into a maximum a-posteriori estimate of several quantities that were otherwise estimated separately. With this paper, our goal is to move one step further, by leveraging object-based simultaneous localization and mapping (SLAM). We use an object pose estimator to localize the robot relative to large elements of the environment, e.g. stair steps. These measurements are merged with other typical observations of legged robots, e.g. inertial measurements, to provide an estimate of the robot state (position, orientation and velocity of the base) along with an accurate estimate of the environment elements. This yields a consistent joint estimate of these two quantities, an important property since both are needed to control the robot's locomotion. We provide a complete implementation of this idea with the object tracker CosyPose, which we trained on our environment and for which we provide a covariance model, and with the SLAM engine Wolf, used as a visual-inertial estimator on the quadruped robot Solo.
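The joint estimation described above is, in essence, a maximum a-posteriori problem over a factor graph whose variables are the robot states and the object poses, and whose factors are the inertial and object-pose measurements weighted by their covariances. Below is a minimal 2D sketch of that idea, not the actual Wolf/CosySLAM implementation: odometry factors stand in for the IMU preintegration, relative object-pose factors stand in for the CosyPose measurements with their covariance model, and all data, weights and function names are hypothetical.

```python
# Minimal sketch of joint MAP estimation over robot poses and one object
# (e.g. a stair step), solved as weighted nonlinear least squares.
# Purely illustrative; does not reflect the Wolf/CosySLAM API.
import numpy as np
from scipy.optimize import least_squares

def residuals(x, odom, obj_meas, sqrt_info_odom, sqrt_info_obj):
    # x packs N robot poses (x, y, theta) followed by one object pose.
    poses = x[:-3].reshape(-1, 3)
    obj = x[-3:]
    # Prior anchoring the first robot pose at the origin (fixes the gauge).
    res = [10.0 * poses[0]]
    # Odometry (IMU-like) factors between consecutive robot poses.
    # Plain subtraction is a crude small-angle stand-in for SE(2) composition.
    for i, d in enumerate(odom):
        res.append(sqrt_info_odom @ ((poses[i + 1] - poses[i]) - d))
    # Object factors: relative pose of the object seen from each robot pose.
    for i, z in obj_meas:
        res.append(sqrt_info_obj @ ((obj - poses[i]) - z))
    return np.concatenate(res)

# Hypothetical data: 3 robot poses stepping along x, object near (2, 1).
odom = [np.array([1.0, 0.0, 0.0])] * 2
obj_meas = [(0, np.array([2.0, 1.0, 0.0])),
            (1, np.array([1.0, 1.0, 0.0])),
            (2, np.array([0.0, 1.0, 0.0]))]
W = np.eye(3)               # square-root information (inverse covariance)
x0 = np.zeros(12)           # initial guess: 3 robot poses + 1 object pose
sol = least_squares(residuals, x0, args=(odom, obj_meas, W, W))
print(sol.x.reshape(-1, 3)) # jointly estimated robot poses and object pose
```

Because the robot poses and the object pose are solved for together, the recovered trajectory and the recovered object location are consistent with each other, which is the property the paper relies on for locomotion control.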