Robust Visual Localization for Hopping Rovers on Small Bodies

2018 
We present a collaborative visual localization method for rovers designed to hop and tumble across the surface of small Solar System bodies, such as comets and asteroids. In a two-phase approach, an orbiting primary spacecraft first maps the surface of a body by capturing images from various poses and illumination angles; these images are processed to create a prior map of 3D landmarks. In the second phase, a hopping rover is deployed to the surface, where it uses a camera to relocalize to the prior map and to perform onboard visual simultaneous localization and mapping (SLAM). Small bodies present several unique challenges to existing visual SLAM algorithms, such as high-contrast shadows that move quickly over the surface due to short (e.g., 1–12 hour) rotational periods, and large changes in visual appearance between orbit and the surface, where image scale varies by many orders of magnitude (kilometers to centimeters). In this work, we describe how to augment ORB-SLAM2, a state-of-the-art visual SLAM implementation, to handle large variations in illumination by fusing prior images with varying illumination angles. We demonstrate how a hopping rover can use a wide field-of-view (FOV) camera to relocalize to prior maps captured by an orbiting spacecraft with a narrow-FOV camera, and how the growth of pose and scale errors can be bounded by periodic loop closures during large hops. The proposed method is evaluated on sequences of images captured around a mock asteroid; it is shown to be robust to varying illumination angles, scene scale changes, and off-nadir camera pointing angles.
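The relocalization idea described above, matching features in a rover image against a prior landmark map built from orbital imagery taken under several illumination angles, can be illustrated with a minimal sketch. The sketch below uses OpenCV's ORB features and PnP-with-RANSAC in Python; it is not the paper's ORB-SLAM2-based implementation, and the function names, the `landmark_xyz_lookup` helper, and all parameter values are hypothetical.

```python
# Illustrative sketch (not the authors' code): fuse ORB descriptors from
# prior images taken under several illumination angles into one landmark
# map, then relocalize a rover image against it via matching + PnP.
import numpy as np
import cv2

orb = cv2.ORB_create(nfeatures=2000)

def build_multi_illumination_map(prior_images, landmark_xyz_lookup):
    """Collect descriptors per 3D landmark across illumination angles.

    prior_images: grayscale orbital images of the same terrain under
        different illumination angles.
    landmark_xyz_lookup: hypothetical callable mapping (image index,
        pixel) to a triangulated 3D point, or None if unknown; assumed
        to come from the orbital mapping phase.
    """
    points3d, descriptors = [], []
    for i, img in enumerate(prior_images):
        kps, descs = orb.detectAndCompute(img, None)
        if descs is None:
            continue
        for kp, d in zip(kps, descs):
            xyz = landmark_xyz_lookup(i, kp.pt)
            if xyz is not None:
                # Keep one descriptor per illumination condition so the
                # map stays matchable as surface shadows move.
                points3d.append(xyz)
                descriptors.append(d)
    return np.array(points3d, np.float32), np.array(descriptors, np.uint8)

def relocalize(rover_img, points3d, descriptors, K):
    """Estimate the rover camera pose w.r.t. the prior map (PnP + RANSAC)."""
    kps, descs = orb.detectAndCompute(rover_img, None)
    if descs is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descs, descriptors)
    if len(matches) < 12:  # illustrative minimum match count
        return None
    img_pts = np.float32([kps[m.queryIdx].pt for m in matches])
    obj_pts = np.float32([points3d[m.trainIdx] for m in matches])
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, None, reprojectionError=4.0)
    return (rvec, tvec) if ok else None
```

Retaining a descriptor for each illumination condition, rather than one canonical descriptor per landmark, is what lets matching succeed even when the rover observes the terrain under shadows very different from any single orbital pass.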