A Versatile Visual Navigation System for Autonomous Vehicles.

2018 
We present a universal visual navigation method that allows a vehicle to autonomously repeat paths previously taught by a human operator. The method is computationally efficient and does not require camera calibration. It can learn and autonomously traverse arbitrarily shaped paths, and it is robust to appearance changes induced by varying outdoor illumination and naturally occurring environment changes. The method does not perform explicit position estimation in 2D/3D space; instead, it relies on a novel mathematical theorem that allows fusing exteroceptive and interoceptive sensory data in a way that ensures navigation accuracy and reliability. Our experiments indicate that the proposed navigation method can accurately guide different autonomous vehicles along the desired path. The presented system, which has already been deployed in patrolling scenarios, is provided as open source at www.github.com/gestom/stroll_bearnav.
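To illustrate the teach-and-repeat idea without explicit 2D/3D localization, the sketch below shows one common bearing-only correction scheme: the robot replays odometry from the taught path and steers by the dominant horizontal displacement between features stored during teaching and features matched in the current view. This is a minimal illustration under our own assumptions (hypothetical function and data), not the authors' exact implementation or theorem.

```python
from collections import Counter

def heading_correction(map_x, view_x, bin_width=10):
    """Estimate a steering correction (in pixels) as the mode of the
    horizontal displacements between matched feature pairs.

    map_x  -- x-coordinates of features stored during the teaching phase
    view_x -- x-coordinates of the same features matched in the current view
    """
    diffs = [v - m for m, v in zip(map_x, view_x)]
    # Histogram voting: pick the most populated displacement bin,
    # which makes the estimate robust to outlier matches.
    bins = Counter(round(d / bin_width) for d in diffs)
    best_bin, _ = bins.most_common(1)[0]
    # Return the mean displacement of the winning bin.
    inliers = [d for d in diffs if round(d / bin_width) == best_bin]
    return sum(inliers) / len(inliers)

# Example: current view shifted ~12 px right of the taught view,
# with one gross mismatch that the voting rejects.
taught  = [100, 220, 340, 460, 580]
current = [112, 231, 353, 471, 900]   # last match is an outlier
print(heading_correction(taught, current))  # → 11.75
```

Because only a 1-D pixel offset is needed, no camera calibration or metric pose estimate is required, which matches the efficiency and calibration-free claims of the abstract.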