Relative pose estimation for vision-based UAV vertically landing on the ship

2018 
A monocular-vision guidance method for autonomous vertical landing on a ship is proposed. It estimates the relative pose between the UAV and the ship and supplies this information to the follow-up flight control system. The paper mainly addresses corner-point extraction within a region of interest (ROI) and pose estimation from those corner points. Images of a self-designed feature plane are collected, and the target points in the ROI are extracted using the HSV color space and invariant moments. The relative pose is then estimated from these points using the homography-matrix principle. In the measurement experiment, the rotation of a rotary table imitates the motion of the ship, and the relative pose is estimated by extracting 12 corner points from images captured by an 800 × 600-pixel monocular camera. Experiments show that the method achieves a maximum relative attitude error of 1.7° and a maximum relative position error of 1.4%, with good robustness and high accuracy.
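A minimal sketch of the homography-based pose-estimation step described above, assuming the 12 corner points lie on a known planar target and that OpenCV is used; the camera intrinsics, target dimensions, and point ordering below are illustrative placeholders, not the paper's actual calibration.

```python
import numpy as np
import cv2

# Hypothetical camera intrinsics for an 800 x 600 image (not from the paper).
K = np.array([[800.0,   0.0, 400.0],
              [  0.0, 800.0, 300.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical layout of 12 corner points of the feature plane (Z = 0), in metres.
object_pts = np.array([[x * 0.1, y * 0.1, 0.0]
                       for y in range(3) for x in range(4)], dtype=np.float64)

def estimate_relative_pose(image_pts):
    """Estimate candidate rotations/translations of the camera w.r.t. the planar target.

    image_pts: (12, 2) array of detected corner coordinates in pixels,
    ordered to correspond one-to-one with object_pts.
    """
    # Homography mapping the planar target coordinates to image coordinates.
    H, _ = cv2.findHomography(object_pts[:, :2], image_pts, cv2.RANSAC)

    # Decompose the homography into candidate (R, t, plane normal) solutions;
    # the physically consistent one must be selected with additional checks
    # (e.g. positive depth of the observed points).
    n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    return rotations, translations, normals
```

A planar PnP solver (e.g. cv2.solvePnP with the same point correspondences) would be an alternative route to the same relative pose.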