DEDU: Dual-Enhancing Dense-UNet for Lowlight Image Enhancement and Denoise


Bibliographic Details
Main Authors: Hyungjoo Park, Hanseo Lim, Dongyoung Jang
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10418134/
Description
Summary: In this paper, we propose an innovative image enhancement algorithm called “Dual-Enhancing-Dense-UNet (DEDUNet)” that simultaneously enhances image brightness and reduces noise. The model is based on Convolutional Neural Network (CNN) algorithms and incorporates techniques such as Decoupled Fully Connected (DFC) attention, skip connections, shortcuts, Cross-Stage-Partial (CSP) blocks, and dense blocks to address both the brightness enhancement and noise removal aspects of image enhancement. This dual approach offers a new solution for restoring and improving high-quality images, presenting new opportunities in the fields of computer vision and image processing. Our experimental results substantiate the superior performance of the proposed algorithm, showing significant improvements in key performance indicators: a Peak Signal-to-Noise Ratio (PSNR) of 19.17, a Structural Similarity Index (SSIM) of 0.71, a Learned Perceptual Image Patch Similarity (LPIPS) of 0.30, a Mean Absolute Error (MAE) of 0.09, and 0.696G Multiply-Accumulate operations (MACs). These results demonstrate the algorithm's image quality enhancement capabilities and its efficiency advantage over existing methods.
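The abstract names dense blocks and UNet-style skip connections among the model's building blocks. The following is a minimal NumPy sketch of those two ideas only, under stated assumptions — it is not the authors' implementation, and `conv_layer` is a hypothetical stand-in (a random projection plus ReLU) for a real convolutional layer:

```python
import numpy as np

def conv_layer(x, growth=4, seed=0):
    # Hypothetical stand-in for a conv+ReLU layer: projects the channel
    # dimension down to `growth` channels with a fixed random matrix.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((x.shape[-1], growth)) * 0.1
    return np.maximum(x @ w, 0.0)  # ReLU

def dense_block(x, num_layers=3):
    # Dense connectivity: each layer sees the concatenation of all
    # preceding feature maps, so channels grow by `growth` per layer.
    features = x
    for i in range(num_layers):
        out = conv_layer(features, seed=i)
        features = np.concatenate([features, out], axis=-1)
    return features

x = np.ones((8, 8, 16))              # (H, W, C) feature map
encoded = dense_block(x)             # channels: 16 + 3*4 = 28
# UNet-style skip connection: the encoder input is concatenated onto
# the processed features before they enter the decoder stage.
decoder_in = np.concatenate([x, encoded], axis=-1)
print(decoder_in.shape)              # (8, 8, 44)
```

The point of both patterns is the same: later stages receive earlier feature maps unchanged, which preserves fine detail that brightness enhancement and denoising would otherwise wash out.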
ISSN: 2169-3536