Generative Adversarial Differential Analysis for Infrared Small Target Detection

Bibliographic Details
Main Authors: Zongfang Ma, Shuo Pang, Fan Hao
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10460986/
Description
Summary: Infrared small target detection refers to the extraction of small targets from a complex, low signal-to-noise ratio background. Depthwise convolution makes it difficult to comprehensively characterize small infrared targets, and it ignores the importance of the image background for the detection task. In this article, we propose the generative adversarial differential analysis (GADA) model for infrared small target detection, whose core aim is to weaken the reliance on target features and enhance the use of background information. Specifically, we first construct pseudobackground labels with the fast marching method. Then, a background-guided generative adversarial network is used to learn the background data distribution. On this basis, a differential image containing the interest regions of small targets is obtained by differential analysis. Finally, the detection results are obtained by performing an elaborate characterization of these interest regions. The effectiveness of GADA is verified on three public datasets; compared with several state-of-the-art methods, GADA achieves better performance in terms of F1, IoU, and AUC.
ISSN:2151-1535
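
The pipeline named in the summary (pseudobackground construction by fast marching, background subtraction via differential analysis, and pixel-level evaluation with F1, IoU, and AUC) can be illustrated with a minimal Python/OpenCV sketch. This is not the authors' GADA implementation: the GAN generator is omitted, OpenCV's Telea fast-marching inpainting stands in for the pseudobackground-label construction, a plain threshold replaces the paper's elaborate characterization of interest regions, and all function names and parameter values below are hypothetical.

```python
# Illustrative sketch only; not the GADA reference code. Assumes 8-bit grayscale
# inputs. cv2.INPAINT_TELEA is OpenCV's fast-marching inpainting.
import cv2
import numpy as np
from sklearn.metrics import f1_score, roc_auc_score


def pseudo_background(image: np.ndarray, target_mask: np.ndarray, radius: int = 3) -> np.ndarray:
    """Fill target pixels from the surrounding background (pseudobackground label)."""
    return cv2.inpaint(image, target_mask.astype(np.uint8), radius, cv2.INPAINT_TELEA)


def differential_image(image: np.ndarray, generated_background: np.ndarray) -> np.ndarray:
    """Subtract the generated background so candidate small targets stand out."""
    diff = cv2.absdiff(image, generated_background)
    return cv2.normalize(diff, None, 0, 255, cv2.NORM_MINMAX)


def interest_regions(diff: np.ndarray, thresh: int = 40) -> np.ndarray:
    """Binarize the differential image into coarse regions of interest."""
    _, roi = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return roi


def evaluate(score_map: np.ndarray, pred_mask: np.ndarray, gt_mask: np.ndarray) -> dict:
    """Pixel-level F1, IoU, and AUC, the metrics named in the abstract."""
    p = (pred_mask.ravel() > 0).astype(int)
    g = (gt_mask.ravel() > 0).astype(int)
    inter = np.logical_and(p, g).sum()
    union = np.logical_or(p, g).sum()
    return {
        "F1": f1_score(g, p),
        "IoU": inter / max(union, 1),
        "AUC": roc_auc_score(g, score_map.ravel().astype(float) / 255.0),
    }
```

In the paper, the generated background would come from the background-guided GAN trained on the pseudobackground labels; in this sketch, pseudo_background() can simply be reused as a stand-in when experimenting with the differential-analysis step alone.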