Adaptive feature matching via joint transformational-spatial clustering

2021 
Transformational and spatial proximities are important cues for identifying inliers in an appearance-based match set, because correct matches generally stay close together in the input images and share similar local transformations. However, most existing approaches check only one type of proximity, or both types sequentially with manually set thresholds, which limits their matching accuracy and their flexibility in handling large-scale images. In this paper, we present an efficient clustering-based approach that identifies match inliers using both proximities simultaneously. It first projects the putative matches into a joint transformational-spatial space, where mismatches tend to scatter while correct matches gather together. A mode-seeking process based on joint kernel density estimation is then proposed to obtain significant clusters in the joint space, where each cluster contains matches mapping the same object across images with high accuracy. Moreover, the kernel bandwidths used to measure match proximities are set adaptively during density estimation, which enhances the method's applicability to matching different images. Experiments on three standard datasets show that the proposed approach delivers superior performance on a variety of feature matching tasks, including multi-object matching, duplicate object matching, and object retrieval.
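The paper's estimator uses joint kernels with adaptively chosen bandwidths; as a simplified illustration of the mode-seeking idea it describes, the sketch below runs plain Gaussian mean-shift in a hypothetical joint space where each putative match is encoded as a 4-D vector [x, y, dx, dy] (keypoint position plus translation offset between images). The function name `mean_shift_modes`, the fixed bandwidth, and this particular match encoding are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mean_shift_modes(points, bandwidth, n_iter=50, tol=1e-5):
    """Gaussian mean-shift mode seeking (illustrative sketch).

    points    : (N, D) array of joint transformational-spatial vectors,
                e.g. [x, y, dx, dy] per putative match (an assumed encoding).
    bandwidth : single isotropic kernel bandwidth (the paper adapts this
                per density estimate; a constant is used here for brevity).
    Returns (modes, labels): distinct density modes and, for each input
    point, the index of the mode it converged to.
    """
    points = np.asarray(points, dtype=float)
    shifted = points.copy()
    for _ in range(n_iter):
        # Squared distances from every shifted point to every data point.
        d2 = ((shifted[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        w = np.exp(-0.5 * d2 / bandwidth**2)
        # Shift each point to the kernel-weighted mean of its neighbours.
        new = (w[:, :, None] * points[None, :, :]).sum(1) / w.sum(1)[:, None]
        done = np.abs(new - shifted).max() < tol
        shifted = new
        if done:
            break
    # Merge converged points that landed on the same mode.
    modes, labels = [], np.empty(len(points), dtype=int)
    for i, p in enumerate(shifted):
        for k, m in enumerate(modes):
            if np.linalg.norm(p - m) < 0.1 * bandwidth:
                labels[i] = k
                break
        else:
            modes.append(p)
            labels[i] = len(modes) - 1
    return np.array(modes), labels
```

In this toy setting, matches on the same object share a similar offset (dx, dy) and nearby positions, so they collapse onto one mode, while scattered mismatches fail to form a dense cluster; the adaptive bandwidths in the paper replace the fixed `bandwidth` used here.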
References: 46 · Citations: 0