Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution
Infrared imaging has broad and important applications. However, infrared detector manufacturing technology limits detector resolution and, consequently, the resolution of infrared images. In this work, we design a Recurrent Large Kernel Attention Neural Network (RLKA-Net) for single infrared image super-resolution (SR) and demonstrate its superior performance.
Main Authors: | Gangping Liu, Shuaijun Zhou, Xiaxu Chen, Wenjie Yue, Jun Ke |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2024-01-01 |
Series: | IEEE Access |
Subjects: | Infrared image super-resolution; image processing; recurrent neural network; attention mechanism |
Online Access: | https://ieeexplore.ieee.org/document/10366265/ |
_version_ | 1797365901112115200 |
---|---|
author | Gangping Liu Shuaijun Zhou Xiaxu Chen Wenjie Yue Jun Ke |
author_facet | Gangping Liu Shuaijun Zhou Xiaxu Chen Wenjie Yue Jun Ke |
author_sort | Gangping Liu |
collection | DOAJ |
description | Infrared imaging has broad and important applications. However, infrared detector manufacturing technology limits detector resolution and, consequently, the resolution of infrared images. In this work, we design a Recurrent Large Kernel Attention Neural Network (RLKA-Net) for single infrared image super-resolution (SR) and demonstrate its superior performance. Compared to other SR networks, RLKA-Net is a lightweight network capable of extracting spatial and temporal features from infrared images. To extract spatial features, we use multiple stacked Recurrent Learning Units (RLUs) to expand the network’s receptive field, while the large kernel attention mechanism in RLUs obtains attention maps at various granularities. To extract temporal features, RLKA-Net uses a recurrent learning strategy to maintain a persistent memory of extracted features, which contributes to more precise reconstruction results. Moreover, RLKA-Net employs an Attention Gate (AG) to reduce the number of parameters and expedite the training process. We demonstrate the efficacy of the Recurrent Learning Stages (RLS), Large Kernel Attention Block (LKAB), and Attention Gate mechanisms through ablation studies. We test RLKA-Net on several infrared image datasets. The experimental results demonstrate that RLKA-Net achieves state-of-the-art performance compared to existing SR models. The code and models are available at <uri>https://github.com/ZedFm/RLKA-Net</uri>. |
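The abstract's large kernel attention mechanism is not specified in detail in this record. As an illustration only, here is a minimal PyTorch sketch of a decomposed large-kernel attention block in the style of Guo et al.'s Visual Attention Network; the class name, channel handling, kernel sizes (5×5 depthwise, 7×7 depthwise with dilation 3, 1×1 pointwise), and gating scheme are assumptions, not the paper's exact LKAB design:

```python
import torch
import torch.nn as nn


class LargeKernelAttention(nn.Module):
    """Illustrative decomposed large-kernel attention (assumed design).

    A large effective receptive field (19x19 here) is built cheaply from
    a small depthwise conv, a dilated depthwise conv, and a 1x1 conv;
    the result is used as a per-pixel attention map that gates the input.
    """

    def __init__(self, channels: int):
        super().__init__()
        # 5x5 depthwise convolution captures local structure.
        self.dw = nn.Conv2d(channels, channels, 5, padding=2, groups=channels)
        # 7x7 depthwise dilated convolution (dilation 3) extends the
        # receptive field; padding 9 keeps spatial size unchanged.
        self.dw_dilated = nn.Conv2d(channels, channels, 7, padding=9,
                                    groups=channels, dilation=3)
        # 1x1 pointwise convolution mixes information across channels.
        self.pointwise = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.pointwise(self.dw_dilated(self.dw(x)))
        return x * attn  # element-wise gating by the attention map
```

The decomposition trades one dense large-kernel convolution for three cheap ones, which is consistent with the abstract's emphasis on a lightweight network with a wide receptive field.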
first_indexed | 2024-03-08T16:56:31Z |
format | Article |
id | doaj.art-146f6bf85c4d47da95c6545d03d394fd |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-08T16:56:31Z |
publishDate | 2024-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-146f6bf85c4d47da95c6545d03d394fd 2024-01-05T00:03:17Z eng | IEEE | IEEE Access | ISSN 2169-3536 | 2024-01-01 | vol. 12, pp. 923-935 | DOI 10.1109/ACCESS.2023.3344830 | article 10366265 | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution | Gangping Liu (https://orcid.org/0000-0003-0103-0917), Shuaijun Zhou, Xiaxu Chen (https://orcid.org/0009-0004-4727-8509), Wenjie Yue, Jun Ke (https://orcid.org/0000-0001-6027-7659) | Affiliations: School of Optics and Photonics, Beijing Institute of Technology, Beijing, China (Liu, Chen, Yue, Ke); Beijing Aerospace Automatic Control Institute, China Aerospace Science and Technology Corporation, Beijing, China (Zhou) | (abstract as in the description field above) | https://ieeexplore.ieee.org/document/10366265/ | Subjects: Infrared image super-resolution; image processing; recurrent neural network; attention mechanism |
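The record does not describe the Attention Gate (AG) that the abstract credits with reducing parameters and speeding up training. As a hypothetical sketch only, here is a generic additive attention gate in the style of Attention U-Net (Oktay et al.); the class name, projections, and the assumption that the gating signal matches the feature map's shape are illustrative, not RLKA-Net's actual AG:

```python
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Illustrative additive attention gate (assumed design).

    A gating signal g re-weights a feature map x through a learned
    per-pixel sigmoid mask, suppressing features the gate deems
    irrelevant before they are passed on.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.theta_x = nn.Conv2d(channels, channels, 1)  # project features
        self.phi_g = nn.Conv2d(channels, channels, 1)    # project gating signal
        self.psi = nn.Conv2d(channels, 1, 1)             # collapse to 1-channel mask

    def forward(self, x: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        mask = torch.sigmoid(self.psi(torch.relu(self.theta_x(x) + self.phi_g(g))))
        return x * mask  # per-pixel gating, broadcast over channels
```

Because the gate is built from 1×1 convolutions, it adds very few parameters relative to the convolutional backbone, which matches the abstract's stated motivation for using it.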
spellingShingle | Gangping Liu Shuaijun Zhou Xiaxu Chen Wenjie Yue Jun Ke Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution IEEE Access Infrared image super-resolution image processing recurrent neural network attention mechanism |
title | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution |
title_full | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution |
title_fullStr | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution |
title_full_unstemmed | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution |
title_short | Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution |
title_sort | recurrent large kernel attention network for efficient single infrared image super resolution |
topic | Infrared image super-resolution image processing recurrent neural network attention mechanism |
url | https://ieeexplore.ieee.org/document/10366265/ |
work_keys_str_mv | AT gangpingliu recurrentlargekernelattentionnetworkforefficientsingleinfraredimagesuperresolution AT shuaijunzhou recurrentlargekernelattentionnetworkforefficientsingleinfraredimagesuperresolution AT xiaxuchen recurrentlargekernelattentionnetworkforefficientsingleinfraredimagesuperresolution AT wenjieyue recurrentlargekernelattentionnetworkforefficientsingleinfraredimagesuperresolution AT junke recurrentlargekernelattentionnetworkforefficientsingleinfraredimagesuperresolution |