SARain-GAN: Spatial Attention Residual UNet Based Conditional Generative Adversarial Network for Rain Streak Removal

Bibliographic Details
Main Authors: Maheshkumar H. Kolekar (Department of Electrical Engineering, Indian Institute of Technology Patna, Patna, Bihar, India; https://orcid.org/0000-0002-4272-3528), Samprit Bose (Department of Electrical Engineering, Indian Institute of Technology Patna, Patna, Bihar, India; https://orcid.org/0009-0008-0045-9475), Abhishek Pai (Department of Information Technology, Bharatiya Vidya Bhavan's Sardar Patel Institute of Technology, Mumbai, Maharashtra, India; https://orcid.org/0009-0008-3201-1181)
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access, Vol. 12, pp. 43874-43888
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3375909
Collection: Directory of Open Access Journals (DOAJ)
Subjects: Image deraining; deep learning; residual UNet; foggy image enhancement
Online Access: https://ieeexplore.ieee.org/document/10466540/
Description: Deraining of images plays a pivotal role in computer vision by addressing the challenges posed by rain, enhancing visibility, and refining image quality by eliminating rain streaks. Traditional methods often fall short of effectively handling intricate rain patterns, resulting in incomplete removal. In this paper, we propose an innovative deep learning-based deraining model leveraging a modified residual UNet and a multiscale attention-guided convolutional neural network module as a discriminator within a conditional generative adversarial network framework. The proposed approach introduces custom hyperparameters and a tailored loss function to facilitate the efficient removal of rain streaks from images. Evaluation on both synthetic and real-world datasets showcases superior performance, as indicated by improved image evaluation metrics such as PSNR, SSIM, and NIQE. The effectiveness of our model extends to improving both rainy and foggy images. We also conduct a comparative analysis of computational complexity in terms of running time, GFLOPs, and number of parameters against other state-of-the-art methods to demonstrate our model's superiority.
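
To make the approach summarized in the description more concrete, the following is a minimal PyTorch sketch of a conditional GAN deraining setup in the same spirit: a residual U-Net style generator paired with a spatial-attention patch discriminator, trained with an adversarial plus L1 objective. All module names, layer sizes, the attention design, and the loss weighting are illustrative assumptions; this record does not specify the published SARain-GAN architecture, its custom hyperparameters, or its tailored loss.

```python
# Illustrative sketch only, not the authors' released code: a minimal conditional GAN
# training step pairing a small residual U-Net style generator with a spatial-attention
# patch discriminator. Layer widths, depths, attention design, and loss weights are
# placeholders, not the paper's tailored loss or hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection (residual learning)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1))

    def forward(self, x):
        return F.relu(x + self.body(x))

class SpatialAttention(nn.Module):
    """Re-weights each spatial location using channel-wise average and max statistics."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, 7, padding=3)

    def forward(self, x):
        stats = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(stats))

class Generator(nn.Module):
    """Residual U-Net style encoder-decoder: rainy image in, derained estimate out."""
    def __init__(self, ch=32):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, ch, 3, padding=1), ResBlock(ch))
        self.enc2 = nn.Sequential(nn.Conv2d(ch, 2 * ch, 3, stride=2, padding=1), ResBlock(2 * ch))
        self.dec1 = nn.Sequential(nn.ConvTranspose2d(2 * ch, ch, 4, stride=2, padding=1), ResBlock(ch))
        self.out = nn.Conv2d(2 * ch, 3, 3, padding=1)

    def forward(self, x):
        e1 = self.enc1(x)   # full-resolution features
        e2 = self.enc2(e1)  # downsampled features
        d1 = self.dec1(e2)  # upsample back to input size
        return torch.sigmoid(self.out(torch.cat([d1, e1], dim=1)))  # U-Net skip connection

class Discriminator(nn.Module):
    """Conditional patch discriminator: scores (rainy, candidate) pairs as real or fake."""
    def __init__(self, ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            SpatialAttention(),
            nn.Conv2d(ch, 2 * ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(2 * ch, 1, 4, padding=1))  # patch-level logits

    def forward(self, rainy, candidate):
        return self.net(torch.cat([rainy, candidate], dim=1))

def train_step(G, D, opt_g, opt_d, rainy, clean, l1_weight=100.0):
    """One cGAN update: BCE adversarial loss plus an L1 reconstruction term."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator update: push real pairs toward 1 and generated pairs toward 0.
    fake = G(rainy).detach()
    real_logits, fake_logits = D(rainy, clean), D(rainy, fake)
    d_loss = bce(real_logits, torch.ones_like(real_logits)) + \
             bce(fake_logits, torch.zeros_like(fake_logits))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator update: fool the discriminator while staying close to the clean target.
    fake = G(rainy)
    logits = D(rainy, fake)
    g_loss = bce(logits, torch.ones_like(logits)) + l1_weight * F.l1_loss(fake, clean)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Example usage with random 64x64 crops:
# G, D = Generator(), Discriminator()
# opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
# opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
# rainy, clean = torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)
# print(train_step(G, D, opt_g, opt_d, rainy, clean))
```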
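
The description also reports PSNR, SSIM, and NIQE together with running time, GFLOPs, and number of parameters. As a small, hedged illustration of two of these quantities (the paper's evaluation protocol and tooling are not given in this record), the snippet below computes PSNR from its standard definition and counts trainable parameters of a PyTorch model; SSIM, NIQE, and GFLOPs are usually obtained from existing image-quality and profiling libraries and are not re-implemented here.

```python
# Illustrative sketch: PSNR from its textbook definition and a trainable-parameter count.
# The 8-bit data range (255) and the use of PyTorch are assumptions, not details taken
# from the paper.
import numpy as np
import torch

def psnr(reference: np.ndarray, restored: np.ndarray, data_range: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((data_range ** 2) / mse)

def count_parameters(model: torch.nn.Module) -> int:
    """Total trainable parameters, one of the complexity figures compared in the abstract."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)
```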