IdleSR: Efficient Super-Resolution Network with Multi-scale IdleBlocks

2020 
In recent years, deep learning approaches have achieved impressive results in single image super-resolution (SISR). However, most of these models require computational and memory resources beyond the capability of most mobile and embedded devices. Significantly reducing the number of operations and parameters while maintaining performance is therefore a meaningful and challenging problem. To address it, we propose IdleSR, an efficient super-resolution network with multi-scale IdleBlocks. Firstly, inspired by information multi-distillation blocks and the hybrid composition of IdleBlocks, we construct efficient multi-scale IdleBlocks at the granularity of the residual block. Secondly, we replace the two 3 \(\times \) 3 kernels in each residual block with a 5 \(\times \) 1 kernel and a 1 \(\times \) 5 kernel, dramatically decreasing parameters and operations. Thirdly, we use gradient scaling, a large input patch size, and extra data during the training phase to compensate for the dropped performance. Experiments show that IdleSR achieves a much better tradeoff among parameters, runtime, and performance than state-of-the-art methods.
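The kernel factorization described in the abstract can be sketched as follows. This is a minimal illustration, assuming a plain residual block with a ReLU between the two convolutions; the exact block structure, channel count, and activation choice in IdleSR are not specified by the abstract, so they are assumptions here, not the paper's specification.

```python
import torch
import torch.nn as nn

class FactorizedResidualBlock(nn.Module):
    """Hypothetical residual block using a 5x1 and a 1x5 kernel
    in place of two 3x3 kernels (block layout is an assumption)."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # A 5x1 followed by a 1x5 convolution covers a 5x5 receptive field
        # with 2 * 5 * C^2 weights, versus 2 * 9 * C^2 for two 3x3 kernels.
        self.conv_v = nn.Conv2d(channels, channels, kernel_size=(5, 1), padding=(2, 0))
        self.conv_h = nn.Conv2d(channels, channels, kernel_size=(1, 5), padding=(0, 2))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.conv_v(x))
        out = self.conv_h(out)
        return x + out  # residual connection preserves the input signal

block = FactorizedResidualBlock(64)
x = torch.randn(1, 64, 32, 32)
y = block(x)
print(y.shape)  # spatial size is preserved by the padding
```

For a 64-channel block, the two factorized kernels hold 2 * 5 * 64 * 64 = 40,960 weights versus 2 * 9 * 64 * 64 = 73,728 for two 3 \(\times \) 3 kernels, which is the roughly 44% parameter reduction per block that motivates the design.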