‘Featuring’ Optical Rails: View-based robot guidance using orientation features on the sphere

2009 
In this paper, we propose extending a view-based method for autonomous track following, introduced as Optical Rails in [3, 6], to vector-valued feature images. Instead of gray or color values, the entire analysis operates on local orientations, extracted from omnidirectional input images and represented as low-frequency vector-valued view descriptors using spherical harmonics. We present new signal processing schemes on the sphere for such view descriptors, enabling efficient approximation, comparison, and differential motion estimation even for incomplete spherical signals. Track following is performed using visual information alone, without auxiliary guidance systems or odometry. We present first results of track following with a mobile robot in an indoor environment, demonstrating the feasibility of this novel approach.
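The core idea of a low-frequency spherical-harmonic view descriptor can be sketched as follows. This is an illustrative simplification, not the paper's implementation: it projects a scalar signal on the sphere (the paper uses vector-valued orientation features) onto all spherical harmonics up to a small bandwidth `L_MAX` using naive midpoint quadrature, and compares two views by the Euclidean distance between their coefficient vectors. The grid sizes and bandwidth are arbitrary choices for the sketch.

```python
import numpy as np
from scipy.special import sph_harm  # Y_l^m(theta, phi); theta = azimuth, phi = polar

L_MAX = 4  # descriptor bandwidth (assumption; a "low-frequency" cutoff)

# Midpoint quadrature grid on the sphere.
n_phi, n_theta = 64, 128
phi = (np.arange(n_phi) + 0.5) * np.pi / n_phi          # polar angle in (0, pi)
theta = (np.arange(n_theta) + 0.5) * 2 * np.pi / n_theta  # azimuth in (0, 2*pi)
THETA, PHI = np.meshgrid(theta, phi)
dA = (np.pi / n_phi) * (2 * np.pi / n_theta) * np.sin(PHI)  # area element on S^2

def descriptor(f):
    """Project a sampled spherical signal f onto Y_l^m for l <= L_MAX.

    Returns the (L_MAX + 1)^2 complex expansion coefficients, i.e. the
    low-frequency view descriptor of f.
    """
    coeffs = []
    for l in range(L_MAX + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, THETA, PHI)
            coeffs.append(np.sum(f * np.conj(Y) * dA))
    return np.array(coeffs)

def view_distance(c1, c2):
    """Compare two views via their descriptor coefficients."""
    return np.linalg.norm(c1 - c2)

# Sanity check: the descriptor of Y_2^1 itself should be a unit vector
# concentrated in the (l=2, m=1) coefficient (orthonormality of Y_l^m).
c = descriptor(sph_harm(1, 2, THETA, PHI))
```

Restricting the expansion to small `l` acts as a low-pass filter on the sphere, which is what makes the descriptors compact and robust for comparison between views; handling incomplete spherical signals and differential motion estimation requires the additional schemes developed in the paper.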