Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning
Since dust particles in the air scatter and absorb light, images captured in sand-dust weather typically show low contrast, color deviation and blurriness, seriously affecting the reliability of visual tasks. Currently, pixel-level enhancement and prior-based methods are used to restore sand-dust images...
Main Authors: | Bosheng Ding, Huimin Chen, Lixin Xu, Ruiheng Zhang |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2022-01-01 |
Series: | IEEE Access |
Subjects: | Style transformation; unsupervised adversarial learning; sand-dust image restoration |
Online Access: | https://ieeexplore.ieee.org/document/9862990/ |
_version_ | 1798035249758732288 |
---|---|
author | Bosheng Ding; Huimin Chen; Lixin Xu; Ruiheng Zhang |
author_facet | Bosheng Ding; Huimin Chen; Lixin Xu; Ruiheng Zhang |
author_sort | Bosheng Ding |
collection | DOAJ |
description | Since dust particles in the air scatter and absorb light, images captured in sand-dust weather typically show low contrast, color deviation and blurriness, seriously affecting the reliability of visual tasks. Currently, pixel-level enhancement and prior-based methods are used to restore sand-dust images. However, these methods cannot accurately extract semantic information from the images due to the loss of information and the complexity of the scene depth, which may lead to color distortion and blurred textures in the restored image. We thus present a two-stage restoration method based on style transformation and an unsupervised sand-dust image restoration network (USDR-Net). In the first stage, the grayscale distribution compensation (GDC) method is used to transform the style of the sand-dust image. After transformation, the color shift is eliminated and potential information is restored in the balanced image. In the second stage, USDR-Net first employs the dark channel prior and the transmission map enhancement network (TME-Network) to generate and refine the transmission map of the balanced image, improving the accuracy of the scene depth. Then, it reconstructs a clear image with faithful color and high contrast via adversarial learning with unpaired sand-dust and clear images. Extensive experimental results show that our method outperforms state-of-the-art algorithms in both qualitative and quantitative evaluations. The mean average precision on the target detection datasets increased from 16.79% to 68.82%. (An illustrative sketch of the two stages is given after the record fields below.) |
first_indexed | 2024-04-11T20:55:36Z |
format | Article |
id | doaj.art-5ffaca827797442bab0f410cda74f2e9 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-04-11T20:55:36Z |
publishDate | 2022-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-5ffaca827797442bab0f410cda74f2e9 (indexed 2022-12-22T04:03:41Z); English; IEEE; IEEE Access, ISSN 2169-3536, vol. 10, pp. 90092-90100, 2022-01-01; DOI: 10.1109/ACCESS.2022.3200163; IEEE article no. 9862990. Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning. Authors: Bosheng Ding, Huimin Chen (https://orcid.org/0000-0002-1326-9784), Lixin Xu (https://orcid.org/0000-0001-8215-6663), Ruiheng Zhang, all with the Science and Technology on Electromechanical Dynamic Control Laboratory, Beijing Institute of Technology, Beijing, China. Keywords: style transformation; unsupervised adversarial learning; sand-dust image restoration. Online access: https://ieeexplore.ieee.org/document/9862990/ |
spellingShingle | Bosheng Ding; Huimin Chen; Lixin Xu; Ruiheng Zhang; Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning; IEEE Access; Style transformation; unsupervised adversarial learning; sand-dust image restoration |
title | Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning |
title_full | Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning |
title_fullStr | Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning |
title_full_unstemmed | Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning |
title_short | Restoration of Single Sand-Dust Image Based on Style Transformation and Unsupervised Adversarial Learning |
title_sort | restoration of single sand dust image based on style transformation and unsupervised adversarial learning |
topic | Style transformation; unsupervised adversarial learning; sand-dust image restoration |
url | https://ieeexplore.ieee.org/document/9862990/ |
work_keys_str_mv | AT boshengding restorationofsinglesanddustimagebasedonstyletransformationandunsupervisedadversariallearning AT huiminchen restorationofsinglesanddustimagebasedonstyletransformationandunsupervisedadversariallearning AT lixinxu restorationofsinglesanddustimagebasedonstyletransformationandunsupervisedadversariallearning AT ruihengzhang restorationofsinglesanddustimagebasedonstyletransformationandunsupervisedadversariallearning |
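Illustrative sketch of the two stages described in the abstract. The record does not give the exact GDC rule, the TME-Network refinement, or the adversarial stage, so the per-channel gray-world-style balance and the classic dark-channel-prior transmission estimate below are stand-ins under stated assumptions, not the authors' implementation; the function names, the `omega` weight, and the `patch` size are hypothetical choices.

```python
# Minimal sketch only: a gray-world-style channel balance standing in for the
# paper's GDC, and the standard dark channel prior transmission estimate that
# USDR-Net is described as refining with its TME-Network (not reproduced here).
import numpy as np
from scipy.ndimage import minimum_filter


def grayscale_distribution_compensation(img: np.ndarray) -> np.ndarray:
    """Shift each RGB channel so its mean matches the global gray level.

    img: float array in [0, 1], shape (H, W, 3). This removes the yellow/brown
    cast typical of sand-dust images; the paper's GDC may differ.
    """
    gray_mean = img.mean()                         # target gray level
    channel_means = img.mean(axis=(0, 1))          # per-channel means
    balanced = img + (gray_mean - channel_means)   # compensate each channel
    return np.clip(balanced, 0.0, 1.0)


def dark_channel_transmission(img: np.ndarray, patch: int = 15,
                              omega: float = 0.95) -> np.ndarray:
    """Coarse transmission map t = 1 - omega * dark_channel(I / A)."""
    # Simplified atmospheric light A: channel-wise max over the brightest
    # 0.1% of dark-channel pixels.
    dark = minimum_filter(img.min(axis=2), size=patch)
    bright_idx = np.argsort(dark.ravel())[-max(1, dark.size // 1000):]
    A = img.reshape(-1, 3)[bright_idx].max(axis=0) + 1e-6
    norm_dark = minimum_filter((img / A).min(axis=2), size=patch)
    return np.clip(1.0 - omega * norm_dark, 0.1, 1.0)
```

In the full method, the balanced image and the refined transmission map would then feed the unpaired adversarial learning stage of USDR-Net, which this sketch deliberately omits.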