Multi-Resolution Space-Attended Residual Dense Network for Single Image Super-Resolution


Bibliographic Details
Main Authors: Jiayv Qin, Xianfang Sun, Yitong Yan, Longcun Jin, Xinyi Peng
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9019616/
Description
Summary: With the help of deep convolutional neural networks, a large number of single image super-resolution (SISR) methods have been developed and have achieved promising performance. However, these methods suffer from over-smoothness in textured regions because they use a single-resolution network to reconstruct both low-frequency and high-frequency information simultaneously. To overcome this problem, we propose a Multi-resolution space-Attended Residual Dense Network (MARDN) that separates low-frequency and high-frequency information to reconstruct high-quality super-resolved images. Specifically, we start from a low-resolution sub-network and add low-to-high resolution sub-networks step by step over several stages. These sub-networks, with different depths and resolutions, produce feature maps of different frequencies in parallel. For instance, the high-resolution sub-network with fewer stages extracts local high-frequency texture information, while the low-resolution one with more stages generates global low-frequency information. Furthermore, a fusion block with channel-wise sub-network attention is proposed to adaptively fuse the feature maps from different sub-networks, instead of applying concatenation and $1\times 1$ convolution. A series of ablation studies and model analyses validates the effectiveness and efficiency of MARDN. Extensive experiments on benchmark datasets demonstrate the superiority of the proposed MARDN over state-of-the-art methods. Our super-resolution results and the source code can be downloaded from https://github.com/Periter/MARDN.
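The channel-wise sub-network attention described in the abstract could be sketched roughly as below. This is a minimal NumPy illustration, not the authors' implementation (see their repository for that): the single-layer gating network, global average pooling, and softmax across sub-networks are assumptions made here to show the general idea of weighting each channel of each sub-network's feature map before summing, rather than concatenating and applying a $1\times 1$ convolution.

```python
import numpy as np

def channelwise_attention_fusion(features, w, b):
    """Fuse sub-network feature maps with per-channel attention weights.

    features: list of S arrays, each of shape (C, H, W) -- feature maps from
        the S sub-networks, assumed already resampled to a common resolution.
    w, b: parameters of a hypothetical one-layer gating network mapping the
        pooled descriptor of length S*C to S*C attention logits.
    Returns the fused (C, H, W) map and the (S, C) attention weights.
    """
    S = len(features)
    C = features[0].shape[0]
    # Global average pooling: one descriptor value per channel per sub-network.
    desc = np.concatenate([f.mean(axis=(1, 2)) for f in features])   # (S*C,)
    logits = (w @ desc + b).reshape(S, C)
    # Softmax across sub-networks, so each channel's weights sum to 1.
    e = np.exp(logits - logits.max(axis=0, keepdims=True))
    att = e / e.sum(axis=0, keepdims=True)                            # (S, C)
    # Weighted sum of the sub-network feature maps, channel by channel.
    fused = sum(att[s][:, None, None] * features[s] for s in range(S))
    return fused, att
```

In a trained network, `w` and `b` would be learned; the softmax across sub-networks lets each channel of the fused output draw mostly from whichever sub-network (high- or low-frequency) is most informative for it.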
ISSN: 2169-3536