Style Transformation-Based Change Detection Using Adversarial Learning with Object Boundary Constraints

2021 
Deep learning has shown promising results on change detection (CD) from bi-temporal remote sensing imagery in recent years. However, it remains challenging to cope with the pseudo-changes caused by seasonal differences and style variations between bi-temporal images. In this paper, an object-level boundary-preserving generative adversarial network (BPGAN) is developed for style transformation-based CD of bi-temporal images. To this end, image objects derived in the spectral domain are incorporated into the image translation to generate object-level target-style-like images. In particular, constraints on object boundary consistency and object homogeneity are imposed during adversarial learning to maintain style and content consistency while regularizing the network training. Furthermore, the Superpixel-Based Fast Fuzzy c-Means (SF-FCM) algorithm is utilized for efficient CD from the object-level style-transformed images. Extensive experiments on SPOT5 and GF1 data confirm the effectiveness of the proposed approach.
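The final step of the pipeline, clustering a difference image into changed and unchanged pixels, can be illustrated with a minimal sketch. This is not the paper's SF-FCM (which clusters superpixels for efficiency) but a plain pixel-wise fuzzy c-means on difference magnitudes; the function names, parameters, and deterministic initialization below are assumptions for illustration only.

```python
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, iters=100):
    """Plain fuzzy c-means on 1-D samples (illustrative, not SF-FCM).
    Returns the membership matrix u (n x c) and cluster centers."""
    # Deterministic init: spread centers across the data range.
    centers = np.linspace(x.min(), x.max(), c)
    u = np.full((len(x), c), 1.0 / c)
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        # Standard FCM membership update: u_ik ∝ d_ik^(-2/(m-1)).
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
        # Center update: fuzzified-membership-weighted mean.
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
    return u, centers

def change_map(img_t1, img_t2_translated):
    """Binary change map from a style-transformed image pair:
    cluster per-pixel difference magnitudes into two classes and
    mark the cluster with the larger center as 'changed'."""
    diff = np.abs(img_t2_translated - img_t1).reshape(-1)
    u, centers = fuzzy_cmeans_1d(diff)
    labels = u.argmax(axis=1)
    changed_cluster = int(centers.argmax())
    return (labels == changed_cluster).reshape(img_t1.shape)
```

In the actual method, `img_t2_translated` would be the BPGAN output whose style matches `img_t1`, so that residual differences reflect land-cover change rather than style variation.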