Deep Learning Approach Based on Tensor-Train for Sparse Signal Recovery

Bibliographic Details
Main Authors: Cong Zou, Fang Yang
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8663363/
Description
Summary: Compressive sensing is a desirable technique for acquiring and reconstructing signals at sub-Nyquist rates. Recently, several deep learning-based studies on solving the compressive sensing problem have been carried out; they dramatically reduce the intensive computational complexity of traditional greedy or convex recovery algorithms and can even improve signal recovery performance. However, as the signal size grows, most of these methods recover signals block by block because of their large computational complexity and memory consumption, which usually imposes a block effect on the recovered signals. To deal with this issue, this paper applies a tensor decomposition method named Tensor-Train (TT) to the neural network, reducing the number of parameters by a tremendous factor and further decreasing the computational complexity, so that large signals can be recovered as a whole. In particular, the TT-decomposition is applied jointly to a stacked denoising autoencoder (SDA), yielding a network called TT-SDA. Experiments indicate that the proposed TT-SDA network preserves the reconstruction performance of the conventional SDA network and outperforms traditional methods, especially at low measurement rates. Meanwhile, it also significantly reduces the computational complexity and occupied memory, making it a time- and memory-efficient method for the compressive sensing problem.
ISSN: 2169-3536
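
Code sketch (not from the article): the summary's core construction is replacing each dense weight matrix of the SDA with a Tensor-Train factorization, so that a (prod(m_k) x prod(n_k)) weight is stored as d small cores G_k of shape (r_{k-1}, m_k, n_k, r_k) with r_0 = r_d = 1. Below is a minimal, self-contained PyTorch sketch of such a TT-factorized fully connected layer. The class name TTLinear, the mode/rank choices, the initialization scale, and the einsum contraction order are illustrative assumptions; the record gives no implementation details, so this is not the authors' code.

import math
import torch
import torch.nn as nn

class TTLinear(nn.Module):
    """Linear layer whose weight matrix is stored in Tensor-Train format.

    Replaces a (prod(in_modes) x prod(out_modes)) dense weight with
    TT-cores of shape (r_{k-1}, in_modes[k], out_modes[k], r_k), cutting
    the parameter count from prod(m_k * n_k) to sum(r_{k-1} m_k n_k r_k).
    """
    def __init__(self, in_modes, out_modes, ranks):
        super().__init__()
        assert len(in_modes) == len(out_modes) == len(ranks) - 1
        assert ranks[0] == 1 and ranks[-1] == 1  # boundary TT-ranks
        self.in_modes = list(in_modes)
        self.cores = nn.ParameterList([
            nn.Parameter(0.1 * torch.randn(ranks[k], in_modes[k],
                                           out_modes[k], ranks[k + 1]))
            for k in range(len(in_modes))
        ])
        self.bias = nn.Parameter(torch.zeros(math.prod(out_modes)))

    def forward(self, x):
        # x: (batch, prod(in_modes)). Keep a 5-axis running tensor
        # (batch, current input mode, remaining input modes,
        #  accumulated output dims, current TT-rank) and absorb one
        # input mode per core, left to right.
        b = x.shape[0]
        t = x.reshape(b, self.in_modes[0], -1, 1, 1)
        for k, core in enumerate(self.cores):
            # Contract input mode i and rank r with core (r, i, n, s).
            t = torch.einsum('biqor,rins->bqons', t, core)
            if k + 1 < len(self.cores):
                m = self.in_modes[k + 1]  # expose the next input mode
                t = t.reshape(b, m, t.shape[1] // m,
                              t.shape[2] * t.shape[3], t.shape[4])
        return t.reshape(b, -1) + self.bias

if __name__ == "__main__":
    # Example: a 4096 -> 1024 layer stored with roughly 7K core
    # parameters instead of the 4096 * 1024 ~= 4.2M weights of a
    # dense layer (mode and rank choices are arbitrary here).
    layer = TTLinear(in_modes=[8, 8, 8, 8],
                     out_modes=[4, 4, 8, 8],
                     ranks=[1, 8, 8, 8, 1])
    y = layer(torch.randn(32, 4096))
    print(y.shape)  # torch.Size([32, 1024])

In this configuration the cores hold 256 + 2048 + 4096 + 512 = 6,912 weights, illustrating the "tremendous factor" of parameter reduction the summary refers to; the TT-ranks control the accuracy/compression trade-off. Stacking such layers with nonlinearities in an encoder/decoder arrangement would give a TT-variant of the SDA in the spirit of the paper, though the actual TT-SDA architecture and training setup are specified only in the full article.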