LiDAR - Stereo Camera Fusion for Accurate Depth Estimation

2020 
Dense 3D reconstruction of the surrounding environment is one of the fundamental ways of perception for Advanced Driver-Assistance Systems (ADAS). In this field, accurate 3D modeling finds applications in many areas, such as obstacle detection, object tracking, and remote driving. This task can be performed with different sensors, such as cameras, LiDARs, and radars. Each presents advantages and disadvantages in terms of depth precision, sensor cost, and accuracy in adverse weather conditions. For this reason, many researchers have explored the fusion of multiple sources to overcome the limitations of each sensor and provide an accurate representation of the vehicle's surroundings. This paper proposes a novel post-processing method for accurate depth estimation, based on a patch-wise depth correction approach, to fuse data from a LiDAR and a stereo camera. This solution allows accurate preservation of edges and object boundaries in multiple challenging scenarios.
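The abstract does not detail the correction step itself, but the general idea of patch-wise LiDAR-stereo fusion can be illustrated with a minimal sketch: divide the image into patches, and inside each patch use the sparse LiDAR returns (assumed already projected into the camera frame) to estimate and remove the local bias of the dense stereo depth. The function name, patch size, and median-offset rule below are illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative sketch (not the paper's exact method): patch-wise correction of a
# dense stereo depth map using sparse LiDAR depth projected into the image plane.
# Assumed inputs: stereo_depth (H x W, metres, dense) and lidar_depth (H x W,
# metres, sparse, 0 where no LiDAR return).
import numpy as np

def patchwise_depth_correction(stereo_depth, lidar_depth, patch=32, min_points=5):
    """Shift each patch of the stereo depth map by the median offset between
    LiDAR and stereo depths inside that patch (LiDAR treated as reference)."""
    fused = stereo_depth.copy()
    H, W = stereo_depth.shape
    for y in range(0, H, patch):
        for x in range(0, W, patch):
            s = (slice(y, min(y + patch, H)), slice(x, min(x + patch, W)))
            mask = lidar_depth[s] > 0  # pixels with a valid LiDAR return
            if mask.sum() >= min_points:
                # Per-patch bias between stereo and LiDAR; the median is robust
                # to outliers near depth discontinuities (object boundaries).
                offset = np.median(lidar_depth[s][mask] - stereo_depth[s][mask])
                fused[s] += offset
    return fused

# Example usage with synthetic data standing in for real sensor frames.
stereo = np.random.uniform(1.0, 50.0, size=(376, 1241)).astype(np.float32)
lidar = np.zeros_like(stereo)
idx = np.random.rand(*stereo.shape) < 0.05          # ~5% sparse LiDAR coverage
lidar[idx] = stereo[idx] + np.random.normal(0, 0.1, idx.sum())
fused = patchwise_depth_correction(stereo, lidar)
```

Operating on patches rather than on the whole image lets the correction adapt to locally varying stereo error, which is one way to keep object boundaries sharp; how the paper handles patches that straddle a depth discontinuity is not specified here.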