A Time-Frequency Network with Channel Attention and Non-Local Modules for Artificial Bandwidth Extension

2020 
Convolutional neural networks (CNNs) have recently attracted increasing attention for the artificial bandwidth extension (ABE) task. However, these methods use the flipped low-frequency phase to reconstruct speech signals, which can lead to the well-known invalid short-time Fourier transform (STFT) problem. Moreover, convolutional operations construct informative features only by fusing channel-wise and spatial information within local receptive fields at each layer. In this paper, we introduce a Time-Frequency Network (TFNet) with channel attention (CA) and non-local (NL) modules for ABE. TFNet exploits information from its time-domain and frequency-domain branches concurrently to avoid the invalid STFT problem. To capture channel and spatial dependencies, we incorporate the CA and NL modules into the fully convolutional networks that form the time and frequency branches of TFNet. Experimental results demonstrate that the proposed method outperforms the competing method.
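For readers unfamiliar with the two modules named in the abstract, the sketch below shows a minimal channel attention (squeeze-and-excitation style) block and a non-local block in PyTorch, assuming a 1-D feature layout (batch, channels, time). The layer sizes, reduction ratio, and placement are illustrative assumptions and do not reproduce the paper's exact TFNet configuration.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Reweights feature channels using globally pooled statistics (SE-style)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)            # squeeze: global context per channel
        self.fc = nn.Sequential(                        # excitation: channel-wise gates
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                               # x: (batch, channels, time)
        w = self.pool(x).squeeze(-1)                    # (batch, channels)
        w = self.fc(w).unsqueeze(-1)                    # (batch, channels, 1)
        return x * w                                    # rescale each channel


class NonLocalBlock(nn.Module):
    """Self-attention over all positions to capture long-range dependencies."""
    def __init__(self, channels: int):
        super().__init__()
        inner = channels // 2
        self.theta = nn.Conv1d(channels, inner, 1)      # query projection
        self.phi = nn.Conv1d(channels, inner, 1)        # key projection
        self.g = nn.Conv1d(channels, inner, 1)          # value projection
        self.out = nn.Conv1d(inner, channels, 1)

    def forward(self, x):                               # x: (batch, channels, time)
        q = self.theta(x).transpose(1, 2)               # (batch, time, inner)
        k = self.phi(x)                                 # (batch, inner, time)
        v = self.g(x).transpose(1, 2)                   # (batch, time, inner)
        attn = torch.softmax(q @ k, dim=-1)             # pairwise affinities over all positions
        y = (attn @ v).transpose(1, 2)                  # aggregate features from every position
        return x + self.out(y)                          # residual connection


# Usage: both blocks are drop-in layers inside a convolutional branch.
feats = torch.randn(2, 64, 256)                         # (batch, channels, time)
feats = ChannelAttention(64)(feats)
feats = NonLocalBlock(64)(feats)
```

The channel attention block addresses the channel-wise dependency, while the non-local block lets every time (or frequency) position attend to all others, going beyond the local receptive field of a plain convolution.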