Multi-scale attention network for image super-resolution

2021 
Abstract Convolutional neural networks (CNNs) have demonstrated irreplaceable advantages in super-resolution. However, many CNN-based methods require large model sizes to achieve superior performance, making them difficult to deploy in practical settings with limited memory. To efficiently balance model complexity and performance, we propose a multi-scale attention network (MSAN) built by cascading multiple multi-scale attention blocks (MSAB), each of which integrates a multi-scale cross block (MSCB) and a multi-path wide-activated attention block (MWAB). Specifically, MSCB first connects three parallel convolutions with different dilation rates hierarchically to aggregate feature information across different levels and scales. Then, MWAB splits the channel features from MSCB into three portions to further improve performance. Rather than being treated equally and independently, each portion is responsible for a specific function, enabling internal communication among channels. Experimental results show that our MSAN outperforms most state-of-the-art methods with relatively few parameters and Mult-Adds.
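Based only on the description in the abstract, the following is a minimal PyTorch sketch of how MSCB and MWAB might be wired. The specific dilation rates, the hierarchical branch connections, the role assigned to each channel portion, and all layer hyperparameters are assumptions for illustration, not the authors' exact design.

```python
import torch
import torch.nn as nn

class MSCB(nn.Module):
    """Multi-scale cross block: three parallel 3x3 convolutions with
    different dilation rates, connected hierarchically so each branch
    also receives the previous branch's output (assumed wiring)."""
    def __init__(self, channels, dilations=(1, 2, 3)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=d, dilation=d)
            for d in dilations
        )
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        outs, prev = [], x
        for conv in self.branches:
            prev = self.act(conv(prev))   # hierarchical: feed previous branch
            outs.append(prev)
        return self.fuse(torch.cat(outs, dim=1)) + x  # residual fusion

class MWAB(nn.Module):
    """Multi-path wide-activated attention block: split channels into
    three portions, give each a distinct role (here: identity, wide
    activation, channel attention -- an assumed assignment), recombine."""
    def __init__(self, channels, expansion=4):
        super().__init__()
        c = channels // 3
        self.wide = nn.Sequential(        # wide-activation path
            nn.Conv2d(c, c * expansion, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c * expansion, c, 1),
        )
        self.attn = nn.Sequential(        # channel-attention path
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels - 2 * c, channels - 2 * c, 1), nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        c = x.size(1) // 3
        p1, p2, p3 = x[:, :c], x[:, c:2 * c], x[:, 2 * c:]
        p2 = self.wide(p2)                # expand-then-contract activation
        p3 = p3 * self.attn(p3)           # reweight channels by global stats
        return self.fuse(torch.cat([p1, p2, p3], dim=1)) + x

class MSAB(nn.Module):
    """Multi-scale attention block: MSCB followed by MWAB, per the abstract."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(MSCB(channels), MWAB(channels))

    def forward(self, x):
        return self.body(x)

# Shape check with an arbitrary channel count divisible by 3:
x = torch.randn(1, 48, 32, 32)
y = MSAB(48)(x)                           # torch.Size([1, 48, 32, 32])
```

Cascading several such MSAB modules, plus a standard upsampling tail, would give the full MSAN described in the abstract; the block count and feature width are not specified there.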