Enhanced Channel Attention Network With Cross-Layer Feature Fusion for Spectral Reconstruction in the Presence of Gaussian Noise

Bibliographic Details
Main Authors: Changwu Zou, Can Zhang, Minghui Wei, Changzhong Zou
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Online Access: https://ieeexplore.ieee.org/document/9935112/
Description
Summary: Spectral reconstruction from RGB images has made significant progress. Previous works usually use noise-free RGB images as input to reconstruct the corresponding hyperspectral images (HSIs). However, due to instrumental limitations or atmospheric interference, noise (e.g., Gaussian noise) is inevitable in the actual image acquisition process, which further increases the difficulty of spectral reconstruction. In this article, we propose an enhanced channel attention network (ECANet) to learn a nonlinear mapping from noisy RGB images to clean HSIs. The backbone of the proposed ECANet is stacked with multiple enhanced channel attention (ECA) blocks. The ECA block is a dual-residual version of the channel attention block, which makes the network focus on key auxiliary information and on features that are more conducive to spectral reconstruction. When the input RGB images are disturbed by Gaussian noise, a cross-layer feature fusion unit concatenates feature maps at different depths for more powerful feature representations. In addition, we design a novel combined loss function as the constraint of the ECANet to achieve more accurate reconstruction results. Experimental results on two HSI benchmarks, CAVE and NTIRE 2020, demonstrate the effectiveness of our method, both visually and quantitatively, over other state-of-the-art methods.
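The abstract does not spell out the internals of the channel attention block that ECANet builds on, but channel attention commonly follows the squeeze-and-excitation pattern: globally pool each channel, derive a per-channel gate, and rescale the channel. The sketch below illustrates only that generic pattern in dependency-free Python; the function name, the identity mapping in place of learned fully connected layers, and the omission of the paper's dual-residual connections and fusion unit are all simplifying assumptions, not the authors' implementation.

```python
import math

def channel_attention(feature_maps):
    """Gate each channel of `feature_maps` (a list of 2-D lists, one per
    channel) by a sigmoid of its global average -- the squeeze-and-excitation
    pattern that channel attention blocks typically build on. The learned
    fully connected layers are replaced by an identity mapping here so the
    sketch stays self-contained (a hypothetical simplification).
    """
    # Squeeze: global average pooling, one descriptor per channel.
    descriptors = []
    for ch in feature_maps:
        total = sum(sum(row) for row in ch)
        count = len(ch) * len(ch[0])
        descriptors.append(total / count)

    # Excite: sigmoid gating of each channel descriptor.
    gates = [1.0 / (1.0 + math.exp(-d)) for d in descriptors]

    # Scale: reweight every value in a channel by that channel's gate.
    return [[[v * g for v in row] for row in ch]
            for ch, g in zip(feature_maps, gates)]

# Usage: a uniform channel keeps a high gate, an all-zero channel stays zero.
fmap = [[[1.0, 1.0], [1.0, 1.0]],
        [[0.0, 0.0], [0.0, 0.0]]]
out = channel_attention(fmap)
```

In the paper's ECA block this gating is additionally wrapped in dual residual connections, whose exact form the record does not specify.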
ISSN:2151-1535