Introducing projective transformations into lunar image correspondence for positioning large distance rover
2016
Positioning a rover across a large traverse distance is an important task of the ground tele-operation center: it can reduce or eliminate the position errors accumulated through continuous measurement and repeated calculation, and it helps the rover reach distant scientific targets. Currently, the most representative high-accuracy positioning methods are built on a multi-camera photogrammetric model, which takes homonymous (corresponding) point pairs extracted from images as constraint points of the camera bundles to establish observation equations. The number, spatial distribution, and matching accuracy of these point pairs determine the effectiveness and accuracy of rover positioning. However, in the case of long-range movement, the images acquired by the rover at two widely separated positions are difficult to match: large scale and rotation changes, opposite viewing directions onto the same scenery, and different illumination conditions between the images produce many outliers. In this paper, we introduce projective transformations, computed approximately from the imaging relations of the two positions, to tackle the outlier-elimination problem, and we design an iterative algorithm that reduces the outliers and refines the positioning result simultaneously. With this method, the rover's initial approximate positioning information constrains the region into which each feature point is projected in the other image, so outliers are removed gradually while almost all inliers are preserved. Finally, several experiments are conducted with lunar surface images acquired by the Chang'E-3 rover, which demonstrate the validity of the proposed method.
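The abstract describes gating candidate matches by where an approximate projective transformation predicts each feature should fall in the other image, then shrinking that gate while the transformation is refined. The following is a minimal sketch of that idea, not the authors' implementation; the initial transformation H0, the radius schedule, and the plain least-squares refit via OpenCV are illustrative assumptions.

```python
import numpy as np
import cv2


def project(H, pts):
    """Apply a 3x3 projective transformation to an Nx2 array of points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # to homogeneous coordinates
    proj = (H @ pts_h.T).T
    return proj[:, :2] / proj[:, 2:3]                  # back to Euclidean coordinates


def iterative_outlier_reduction(H0, pts1, pts2, radii=(50.0, 20.0, 5.0)):
    """Keep match pairs whose point in image 2 lies within a shrinking radius of the
    location predicted by the current transformation, re-estimating H each pass.

    H0     : approximate homography derived from the rover's rough pose (assumed given)
    pts1   : Nx2 feature locations in the first image
    pts2   : Nx2 matched feature locations in the second image
    radii  : gating thresholds in pixels, from loose to tight (illustrative values)
    """
    H = H0.astype(np.float64)
    keep = np.arange(len(pts1))
    for r in radii:
        pred = project(H, pts1[keep])
        dist = np.linalg.norm(pred - pts2[keep], axis=1)
        keep = keep[dist < r]                          # drop likely outliers
        if len(keep) >= 4:                             # a homography needs >= 4 pairs
            H, _ = cv2.findHomography(pts1[keep], pts2[keep], method=0)
    return keep, H
```

In the paper the refinement step is coupled with the photogrammetric positioning solution rather than a plain homography refit; the sketch only illustrates the reprojection-gated, progressively tightened outlier removal.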