A method for integrating vision and laser range measurements in autonomous ground robotic vehicles

2004 
This paper presents a method to integrate non-stereoscopic vision information with laser distance measurements for Autonomous Ground Robotic Vehicles (AGRV). The method assumes a horizontally mounted Laser Measurement System (LMS) sweeping a 180-degree arc in front of the vehicle, from right to left, once per second, and a video camera mounted five feet above the ground, pointing forward and downward at 45 degrees to the horizontal. The LMS gives highly accurate obstacle position measurements in a two-dimensional plane, whereas the vision system gives limited and less accurate information on obstacle positions in three dimensions. The vision system can also detect contrast from ground markings. Many AGRVs use similar sensors in similar arrangements, and the method presented here generalizes to many types of distance sensors, cameras, and lenses. Since the data from these two sensors come in radically different formats, an AGRV needs a scheme to transform them into a common format in which they can be compared and correlated. A successful integration method allows the AGRV to make intelligent path-planning decisions, and integrating these two sensors is one of the key challenges for AGRVs that use this approach. The method presented in this paper employs a geometrical approach to combine the two data sets in real time. Tests, carried out both in simulation and on an actual AGRV, show excellent results.
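
The abstract does not reproduce the paper's equations, but a minimal sketch can illustrate the kind of geometric transform it describes: mapping both the LMS sweep and the camera image into a common ground-plane coordinate frame. The Python sketch below assumes a pinhole camera model with hypothetical intrinsics (FX, FY, CX, CY), the camera mounted roughly 1.524 m (five feet) above the ground and tilted 45 degrees down, and the LMS returning one 180-degree sweep of range readings; it is an illustration under these assumptions, not the paper's actual implementation.

    import numpy as np

    # Assumed mounting geometry from the abstract: camera 5 ft (~1.524 m)
    # above the ground, tilted 45 degrees down; LMS sweeping 180 degrees
    # in a horizontal plane. Intrinsics below are hypothetical values.
    CAMERA_HEIGHT_M = 1.524
    CAMERA_TILT_RAD = np.deg2rad(45.0)
    FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0  # hypothetical pinhole intrinsics

    def laser_to_ground_xy(ranges_m, start_deg=-90.0, end_deg=90.0):
        """Convert one 180-degree LMS sweep (polar) to Cartesian ground-plane points."""
        angles = np.deg2rad(np.linspace(start_deg, end_deg, len(ranges_m)))
        x = ranges_m * np.cos(angles)  # forward distance
        y = ranges_m * np.sin(angles)  # lateral offset (left positive)
        return np.column_stack((x, y))

    def pixel_to_ground_xy(u, v):
        """Project an image pixel onto the ground plane (z = 0) via the pinhole model."""
        # Ray through the pixel in camera coordinates (x right, y down, z forward).
        ray_cam = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
        c, s = np.cos(CAMERA_TILT_RAD), np.sin(CAMERA_TILT_RAD)
        # Rotate the ray into the robot frame (x forward, y left, z up),
        # with the camera pitched down by the tilt angle.
        ray_robot = np.array([
            c * ray_cam[2] - s * ray_cam[1],   # forward component
            -ray_cam[0],                       # lateral component
            -s * ray_cam[2] - c * ray_cam[1],  # vertical component (negative for ground pixels)
        ])
        if ray_robot[2] >= 0:
            return None                        # ray never intersects the ground
        t = CAMERA_HEIGHT_M / -ray_robot[2]    # scale so the ray reaches z = 0
        return np.array([t * ray_robot[0], t * ray_robot[1]])

With both sensors expressed as (x, y) points in the vehicle's ground-plane frame, laser returns and vision-detected features such as ground markings can be compared and correlated directly, which is the common-format idea the abstract describes.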