A Calibration Method for Mobile Omnidirectional Vision Based on Structured Light

2020 
Mobile omnidirectional structured light vision is increasingly used in scene perception and robot navigation. The vision system captures a wide field of view in a single image, and laser image features can be detected and extracted easily and quickly. In this paper, a novel calibration method for a mobile omnidirectional camera based on structured light is presented. Firstly, a set of parallel laser planes is projected by the structured light onto the walls of a corridor as auxiliary targets, intersecting the walls orthogonally. Secondly, the constraint relationship between the vanishing points in fisheye images and the intrinsic parameters of the imaging model is analyzed. Finally, the effects on the calibration results of the laser stripes' interval and of the angle between the ground and the wall containing the laser stripes are evaluated. Compared with Scaramuzza's method, the proposed calibration method shows superiority in both feasibility and efficiency. The method has the characteristic of self-calibration, since the planar target is replaced by actively projected laser stripes. The results illustrate that the method is simple and feasible to operate while remaining effective and accurate, and that the calibration parameters are independent of the laser stripes' interval and of the angle between the wall and the ground. Therefore, the mobile omnidirectional structured light vision method presented in this paper can be applied to many areas.
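For intuition about the vanishing-point constraint mentioned in the second step, the sketch below shows the analogous relation for a simple pinhole model with square pixels and a known principal point: vanishing points of two orthogonal scene directions (for example, parallel laser stripes on a wall and their orthogonal counterparts) constrain the focal length. This is a minimal illustrative sketch, not the paper's method; the paper applies the constraint within a fisheye/omnidirectional imaging model, and all function names and numeric values here are assumptions.

```python
import numpy as np

def line_through(p, q):
    """Homogeneous image line through two image points (cross product)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two image lines that are projections of parallel 3-D lines."""
    v = np.cross(line_a, line_b)
    return v / v[2]

def focal_from_orthogonal_vps(v1, v2, cx, cy):
    """
    Pinhole model K = [[f, 0, cx], [0, f, cy], [0, 0, 1]]: for vanishing
    points v1, v2 of two orthogonal directions, v1^T (K K^T)^-1 v2 = 0,
    which reduces to f^2 = -((x1-cx)(x2-cx) + (y1-cy)(y2-cy)).
    """
    f2 = -((v1[0] - cx) * (v2[0] - cx) + (v1[1] - cy) * (v2[1] - cy))
    return np.sqrt(f2)

# Synthetic check with assumed values (not from the paper):
# f = 800, principal point (640, 360); v1, v2 are vanishing points of the
# orthogonal directions (1, 0, 1) and (-1, 0, 1).
v1 = np.array([1440.0, 360.0, 1.0])
v2 = np.array([-160.0, 360.0, 1.0])
print(focal_from_orthogonal_vps(v1, v2, 640.0, 360.0))  # ~800.0
```

In the omnidirectional case the same idea applies after the fisheye projection is modeled (for instance with a polynomial model as in Scaramuzza's approach), which is why the actively projected laser stripes can replace a planar calibration target.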