Fusion of GNSS and Vision for a Robust Navigation Solution: Concept and Demonstration with COTS Sensors on UAV

2015 
This paper discusses the opportunities offered by the navigation solution developed by Airbus Defence and Space, which relies on multi-sensor fusion to enhance or back up GNSS navigation. First, it focuses on the algorithmic solutions developed for autonomous and relative navigation based on hybridization of 2D images and an Inertial Navigation System (INS). It then deals with key situations in which robustness to GNSS-critical cases is improved through GNSS/INS/Vision hybridization. In particular, real-time performance of the solution, flown on a mini-UAV, is presented. Initiated by the European Space Agency (ESA) in the late nineties, vision-based navigation has become a key technology for space exploration missions: it is now seen as a key component of the Guidance, Navigation and Control (GNC) system for ESA missions such as Mars and Moon precise landing, in-orbit rendezvous, and interplanetary navigation. A core vision-based navigation solution has been matured at Airbus Defence and Space and is now capable of meeting a large set of requirements under different environments, both in space and on Earth. Vision-Based Navigation (VBN) – mimicking human vision – allows navigation with respect to unknown terrain or objects. By construction it is not tied to a specific reference frame and inherently tends to drift over time. However, since it does not depend on external systems prone to failure, VBN integrity can be assessed internally. It therefore provides information complementary to RF sensors, as the short-term performance of VBN can be fused with absolute GNSS measurements. This core solution developed at Airbus Defence and Space, relying on fusion of monocular images, the patented "Ultra Fast Slam" solution, and INS, is complementary with GNSS measurements in many situations, in particular in the urban environment.
For example, during loss of GNSS visibility or jamming of the receiver, the relative navigation is capable of maintaining navigation accuracy at nominal level for minutes, whereas INS-only navigation would quickly drift. Kalman filtering is furthermore able to arbitrate between the diverse positioning measurements and can thus naturally provide detection of spoofing and robustness against incoherent measurements. Local and short-term drifts of the GNSS due to multipath and ionospheric variation are also likely to be estimated, since drift in the GNSS position can be corrected when the camera observes no corresponding motion. Hence GNSS/INS/Vision hybridization is compatible with a broad field of applications, such as urban-canyon positioning and smooth outdoor–indoor transition, and can in general be used in uncertain environments where GNSS signals are likely to be corrupted. Several UAV flights under various environmental conditions have shown that the combination of all vision-based technologies can improve navigation performance and enhance robustness to GNSS failures, especially in challenging environments. The hybridization, the flights, and the performance results are presented in this paper. Application cases of this technology are then discussed, focusing especially on UAV autonomous navigation and aiding for piloted aviation. The presentation concludes with further hybridization possibilities offered by the vision-based navigation architecture, e.g. LiDAR imaging, radar and barometric altimeters, and Doppler measurements.
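The spoofing detection and rejection of incoherent measurements mentioned above can be illustrated by innovation gating in a Kalman filter: a measurement whose normalized innovation exceeds a chi-square threshold is discarded rather than fused. The following is a minimal sketch of that mechanism on a scalar position state, not the authors' actual implementation; the function name, state model, and threshold value are illustrative assumptions.

```python
def kalman_update(x, P, z, R, gate=9.0):
    """Scalar Kalman update with innovation gating.

    x, P : state estimate and its variance
    z, R : measurement and its variance
    gate : chi-square threshold, 1 dof (~3 sigma)
    Returns (x, P, accepted).
    """
    y = z - x            # innovation (measurement residual)
    S = P + R            # innovation covariance
    if y * y / S > gate: # normalized innovation squared test
        return x, P, False  # reject as incoherent (e.g. spoofed)
    K = P / S            # Kalman gain
    x = x + K * y
    P = (1.0 - K) * P
    return x, P, True

# Usage: a vision-derived fix near the true position is accepted;
# a spoofed GNSS fix offset by 50 m fails the gate and is ignored,
# leaving the estimate unaffected.
x, P = 0.0, 4.0
x, P, ok_vision = kalman_update(x, P, z=0.5, R=1.0)  # accepted
x, P, ok_gnss = kalman_update(x, P, z=50.0, R=1.0)   # rejected
print(ok_vision, ok_gnss, round(x, 2))               # True False 0.4
```

In a full GNSS/INS/Vision filter the same test runs per measurement channel, which is what lets the fusion keep the short-term vision/INS solution while discarding a GNSS channel that has become inconsistent.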