Multiscale Progressive Segmentation Network for High-Resolution Remote Sensing Imagery

2022 
Semantic segmentation of high-resolution remote sensing images (HRSIs) is a critical task for a wide range of applications, such as precision agriculture and urban planning. Although convolutional neural networks (CNNs) have recently made great progress on this task, several challenges remain, one of which is simultaneously segmenting objects with large scale variations within an HRSI. To address this challenge, previous CNNs often adopt multiple convolution kernels in one layer or skip connections between different layers to extract multiscale representations. However, owing to the limited learning capacity of a single CNN, such networks tend to trade off accuracy across object scales, which leads to unsatisfactory segmentation results for some objects, especially very small or very large ones. In this article, we propose a multiscale progressive segmentation network to address this issue. Instead of forcing one network to handle all object scales, our network cascades three subnetworks that progressively segment small-scale, large-scale, and other-scale objects. To make each subnetwork focus on objects of its specific scale, a scale guidance module is designed: it exploits the segmentation results of the preceding subnetwork to guide the feature learning of the succeeding one. In addition, to obtain the final segmentation results, we propose a position-sensitive module that adaptively combines the outputs of the three subnetworks by assigning each subnetwork a combination weight according to its importance. Experiments on two benchmark datasets, Vaihingen and Potsdam, show that the proposed network achieves considerable improvements over several state-of-the-art segmentation models.
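The abstract describes a cascade of three scale-specific subnetworks, a scale guidance module that feeds each subnetwork's prediction into the next, and a position-sensitive module that fuses the three outputs with per-pixel weights. The sketch below illustrates that control flow only; it is not the authors' implementation. The use of PyTorch, the class names, and the reduction of each subnetwork to a single convolutional head are assumptions made for brevity.

```python
import torch
import torch.nn as nn

class ScaleGuidance(nn.Module):
    """Hypothetical sketch: uses the preceding subnetwork's segmentation
    map to re-weight the features fed to the succeeding subnetwork."""
    def __init__(self, channels, num_classes):
        super().__init__()
        self.proj = nn.Conv2d(num_classes, channels, kernel_size=1)

    def forward(self, features, prev_logits):
        # Turn the previous prediction into a spatial gate so the next
        # subnetwork can focus on objects of the scales still unsegmented.
        guidance = torch.sigmoid(self.proj(prev_logits))
        return features * guidance

class MultiscaleProgressiveNet(nn.Module):
    """Illustrative cascade of three scale-specific heads with a
    position-sensitive fusion of their outputs (assumed structure)."""
    def __init__(self, backbone, channels, num_classes):
        super().__init__()
        self.backbone = backbone  # any feature extractor producing `channels` maps
        self.subnets = nn.ModuleList(
            [nn.Conv2d(channels, num_classes, 3, padding=1) for _ in range(3)]
        )
        self.guides = nn.ModuleList(
            [ScaleGuidance(channels, num_classes) for _ in range(2)]
        )
        # Position-sensitive module: predicts one per-pixel weight for
        # each of the three subnetwork outputs.
        self.fusion = nn.Conv2d(channels, 3, kernel_size=1)

    def forward(self, x):
        feats = self.backbone(x)
        outputs, cur = [], feats
        for i, subnet in enumerate(self.subnets):
            logits = subnet(cur)           # segment one scale group
            outputs.append(logits)
            if i < len(self.guides):       # guide the next subnetwork
                cur = self.guides[i](feats, logits)
        # Per-pixel softmax weights decide how much each subnetwork
        # contributes at every spatial position.
        weights = torch.softmax(self.fusion(feats), dim=1)
        fused = sum(w.unsqueeze(1) * o
                    for w, o in zip(weights.unbind(dim=1), outputs))
        return fused, outputs

# Example usage with a toy backbone (hypothetical shapes and class count):
backbone = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
model = MultiscaleProgressiveNet(backbone, channels=64, num_classes=6)
fused, per_scale = model(torch.randn(1, 3, 256, 256))
```

The softmax over the three weight maps mirrors the described behavior of assigning combination weights to the subnetworks by importance, here resolved independently at every pixel.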