Tree-Structured Dilated Convolutional Networks for Image Compressed Sensing


Bibliographic Details
Main Authors: Rui Lu, Kuntao Ye
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9889727/
Description
Summary: To better recover a sparse image signal carrying redundant information from far fewer measurements than the Nyquist-Shannon sampling theorem suggests, convolutional neural networks (CNNs) can be used to emulate a compressed sensing (CS) process. However, existing CNN-based CS methods suffer from high computational complexity and unsatisfactory reconstruction quality. This study presents a faster CNN-based algorithm that reconstructs images with finer texture details from CS measurements. A tree-structured dilated convolutional network (TDCN) for image CS is proposed. To extract multi-scale image features as fully as possible for better reconstruction, the TDCN combines tree-structured residual blocks made of three dilated convolution layers with different dilation factors; the output of each dilated convolution layer is fed to a fusion layer to eliminate the information loss caused by multiple cascaded dilated convolutions. Moreover, L1 loss is employed as the optimization objective instead of L2 loss to improve training and achieve better convergence. Extensive CS experiments demonstrate that the proposed TDCN outperforms existing state-of-the-art methods in terms of both PSNR and SSIM at different sampling rates while maintaining fast computational speed. Our code and the trained model are available at https://github.com/UHADS/TDCN.
ISSN:2169-3536
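
The abstract's core ideas can be illustrated with a minimal NumPy sketch: a "same"-padded dilated convolution, a residual block that fuses the outputs of three dilated convolutions with different dilation factors, and an L1 training loss. The specific dilation factors (1, 2, 4), the weighted-sum stand-in for the paper's fusion layer, and all helper names are illustrative assumptions, not taken from the TDCN paper itself.

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation):
    """'Same'-padded 2-D dilated convolution (single channel, stride 1)."""
    kh, kw = kernel.shape
    H, W = x.shape
    pad_h, pad_w = dilation * (kh // 2), dilation * (kw // 2)
    xp = np.pad(x, ((pad_h, pad_h), (pad_w, pad_w)))
    out = np.zeros((H, W), dtype=float)
    # Accumulate each kernel tap, spaced `dilation` pixels apart.
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * xp[i * dilation:i * dilation + H,
                                     j * dilation:j * dilation + W]
    return out

def tree_block(x, kernels, fuse_weights):
    """Sketch of a tree-structured residual block: three dilated convolutions
    (assumed dilation factors 1, 2, 4) run on the same input, their outputs are
    fused (here a weighted sum standing in for a learned fusion layer), and a
    residual skip connection adds the input back."""
    feats = [dilated_conv2d(x, k, d) for k, d in zip(kernels, (1, 2, 4))]
    fused = sum(w * f for w, f in zip(fuse_weights, feats))
    return x + fused  # residual connection

def l1_loss(pred, target):
    """L1 loss, used by the paper in place of L2 for better convergence."""
    return np.abs(pred - target).mean()
```

Because each branch pads by `dilation * (kernel_size // 2)`, all three branches keep the input's spatial size, so their feature maps can be fused directly; the skip connection preserves information that the cascaded dilated convolutions might otherwise lose.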