Image Super-resolution by Residual Attention Network with Multi-skip Connection

Deep convolutional neural networks (CNNs) become difficult to train as they grow deeper. Moreover, in image super-resolution, the channel-wise features and inputs of the low-resolution (LR) image are treated equally across channels, which limits the representational ability of the CNN. To address these issues, a residual attention network with multi-skip connection (RANMC) is proposed for single-image super-resolution (SISR). It employs a residual-in-multi-skip-connection (RIMC) structure to build a very deep network from several residual groups, where each residual group (RG) contains a number of short skip connections (SSC) and multi-skip connections (MC). Based on RIMC, abundant low-frequency (LF) information is bypassed through the multi-skip connections, letting the principal network concentrate on learning high-frequency (HF) information. Furthermore, to exploit interdependencies along the channel and spatial dimensions, an attention mechanism block (AMBlock) is proposed that attends to the location of informative features and adaptively rescales channel-wise features, combining a spatial attention (SA) mechanism and a channel attention (CA) mechanism. Experiments indicate that RANMC not only recovers image details better but also achieves higher image quality and network performance.
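The channel-wise rescaling the abstract describes can be illustrated with a minimal NumPy sketch. This shows a generic squeeze-and-excitation-style channel attention mechanism, not the paper's exact AMBlock; the function name, the reduction ratio, and the random placeholder weights (standing in for learned parameters) are all illustrative assumptions.

```python
import numpy as np

def channel_attention(feature_map, reduction=4, rng=None):
    """Rescale each channel of a (C, H, W) feature map by a gate in (0, 1).

    Sketch of the channel-attention idea: squeeze (global average pool)
    -> excitation (small bottleneck MLP) -> sigmoid gate -> channel-wise
    rescaling.  The weight matrices are random placeholders for learned
    parameters.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    c = feature_map.shape[0]
    # Squeeze: one scalar descriptor per channel via global average pooling.
    descriptor = feature_map.mean(axis=(1, 2))            # shape (C,)
    # Excitation: bottleneck MLP reduces then restores the channel dimension.
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ descriptor, 0.0)             # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))           # sigmoid, shape (C,)
    # Rescale: every spatial position of a channel is scaled by its gate.
    return feature_map * gate[:, None, None]

x = np.random.default_rng(1).standard_normal((8, 16, 16))
y = channel_attention(x)
print(y.shape)  # (8, 16, 16)
```

Because the gate is computed per channel, informative channels can be amplified relative to uninformative ones, which is the adaptive readjustment of channel-wise features the abstract refers to.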


Bibliographic Details
Main Authors: LIU Zun-xiong, ZHU Cheng-jia, HUANG Ji, CAI Ti-jian
Format: Article
Language: Chinese (zho)
Published: Editorial office of Computer Science, 2021-11-01
Series: Jisuanji kexue (Computer Science)
Subjects: image super-resolution; attention mechanism block; residual network; residual in multi-skip connection; skip connection
Online Access: https://www.jsjkx.com/fileup/1002-137X/PDF/1002-137X-2021-11-258.pdf
ISSN: 1002-137X
DOI: 10.11896/jsjkx.201000033
Volume/Issue/Pages: Vol. 48, No. 11, pp. 258-267
Author Affiliation: School of Information Engineering, East China Jiaotong University, Nanchang 330013, China