External-Internal Attention for Hyperspectral Image Super-Resolution

2022 
In recent years, hyperspectral image (HSI) super-resolution (SR) has made significant progress by leveraging convolutional neural networks. Existing methods with spectral or spatial attention consider only spectral similarity or pixel-to-pixel similarity, ignoring sample-to-sample correlations and sparsity. Therefore, based on the fusion of an HSI and a multispectral image, we propose a new HSI SR model with external-internal attention (EIA). Instead of considering a single sample, an external attention module is employed to exploit correlations between different samples and obtain a better feature representation. In addition, an internal attention module based on the non-local operation is designed to capture long-range dependencies. In particular, to achieve high mapping precision at low computational cost during inference, spherical locality-sensitive hashing (LSH) is used to divide features into hash buckets, so that each query point is computed only within its assigned bucket rather than from a weighted sum of features across all positions. The sequential EIA greatly improves the generalization ability and robustness of the model by modeling at both the dataset level and the sample level. Extensive experiments on five widely used datasets, in comparison with state-of-the-art models, demonstrate the advantages of the proposed method.
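The external attention idea the abstract describes, attending over a small memory shared across the whole dataset instead of over a single sample's own positions, can be sketched as follows. This is a minimal NumPy illustration of the general external-attention mechanism (learnable key/value memories with double normalization); the memory size and shapes are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def external_attention(x, mk, mv):
    """External attention sketch: each input position attends over a
    small external memory shared across samples (dataset-level modeling).
    x:  (n, d) input features for one sample (e.g. n pixels, d channels)
    mk: (s, d) external key memory, shared across the dataset
    mv: (s, d) external value memory, shared across the dataset
    """
    attn = x @ mk.T                                   # (n, s) slot affinities
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)           # softmax over memory slots
    attn /= attn.sum(axis=0, keepdims=True) + 1e-9    # double normalization
    return attn @ mv                                  # (n, d) re-expressed features

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))    # 16 positions, 8 channels (assumed sizes)
mk = rng.standard_normal((4, 8))    # 4 memory slots
mv = rng.standard_normal((4, 8))
y = external_attention(x, mk, mv)
print(y.shape)  # (16, 8)
```

Because the memory is far smaller than the number of positions, the cost is linear in n, which is what lets external attention capture cross-sample correlations cheaply.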
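The LSH bucketing step can likewise be sketched: features are hashed with random spherical projections so that each query only attends within its own bucket. This is a generic spherical-LSH sketch under assumed shapes and bucket counts, not the paper's exact scheme.

```python
import numpy as np

def spherical_lsh_buckets(x, n_buckets, seed=0):
    """Spherical LSH sketch: normalize features to the unit sphere,
    project onto random directions, and hash each vector to the
    direction with the largest projection (angular hashing). Nearby
    vectors land in the same bucket with high probability, so attention
    can be restricted to within-bucket pairs instead of all positions.
    n_buckets and the projection scheme are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    proj = rng.standard_normal((d, n_buckets))          # random directions
    xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-9)
    scores = xn @ proj                                  # (n, n_buckets)
    return scores.argmax(axis=1)                        # bucket id per vector

rng = np.random.default_rng(1)
feats = rng.standard_normal((32, 8))                    # 32 features, 8-dim
buckets = spherical_lsh_buckets(feats, n_buckets=4)
print(buckets.shape)  # (32,)
# downstream, attention weights are computed only among features
# that share a bucket, avoiding the full n x n affinity matrix
```

Restricting attention to each bucket reduces the quadratic cost of full non-local attention, which is the low-cost-inference motivation the abstract states.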