CoDerainNet: Collaborative Deraining Network for Drone-View Object Detection in Rainy Weather Conditions

Bibliographic Details
Main Authors: Yue Xi, Wenjing Jia, Qiguang Miao, Junmei Feng, Xiangzeng Liu, Fei Li
Format: Article
Language: English
Published: MDPI AG 2023-03-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/15/6/1487
Description
Summary: Benefiting from the advances in object detection in remote sensing, detecting objects in images captured by drones has achieved promising performance in recent years. However, drone-view object detection in rainy weather conditions (Rainy DroneDet) remains a challenge, as small-sized objects blurred by rain streaks offer little valuable information for robust detection. In this paper, we propose a Collaborative Deraining Network called “CoDerainNet”, which simultaneously and interactively trains a deraining subnetwork and a DroneDet subnetwork to improve the accuracy of Rainy DroneDet. Furthermore, we propose a Collaborative Teaching paradigm called “ColTeaching”, which leverages rain-free features extracted by the Deraining Subnetwork and teaches such features to the DroneDet Subnetwork, to remove rain-specific interference from features for DroneDet. Due to the lack of an existing dataset for Rainy DroneDet, we built three drone datasets, including two synthetic datasets, namely RainVisdrone and RainUAVDT, and one real drone dataset, called RainDrone. Extensive experimental results on the three rainy datasets show that CoDerainNet can significantly reduce the computational costs of state-of-the-art (SOTA) object detectors while maintaining detection performance comparable to these SOTA models.
ISSN: 2072-4292
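
The abstract describes ColTeaching as a feature-level teaching scheme in which rain-free features from the deraining subnetwork guide the detection subnetwork during joint training. The sketch below is only a minimal illustration of that idea in PyTorch, not the paper's implementation: the toy subnetworks, the L1 deraining loss, the MSE feature-mimicking loss, the loss weights, and the function names are all assumptions made for illustration.

```python
# Minimal sketch of a collaborative "teaching" training step.
# All architectures, losses, and weights below are illustrative assumptions,
# NOT the CoDerainNet/ColTeaching implementation from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DerainSubnet(nn.Module):
    """Toy deraining branch: predicts a rain-free image and exposes an
    intermediate 'rain-free' feature map (assumed interface)."""
    def __init__(self, ch=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, rainy):
        feat = self.encoder(rainy)       # rain-free features (teacher side)
        derained = self.decoder(feat)    # reconstructed clean image
        return derained, feat


class DroneDetBackbone(nn.Module):
    """Toy stand-in for the detection subnetwork's feature extractor;
    detection heads and detection losses are omitted for brevity."""
    def __init__(self, ch=32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, rainy):
        return self.stem(rainy)          # features to be "taught" (student side)


def collaborative_step(derain_net, det_net, rainy, clean, optimizer,
                       w_derain=1.0, w_teach=0.5):
    """One joint update: a deraining loss plus a feature-mimicking 'teaching'
    loss that pulls detection features toward the rain-free features.
    A full pipeline would add the detection loss here as well."""
    derained, teacher_feat = derain_net(rainy)
    student_feat = det_net(rainy)

    loss_derain = F.l1_loss(derained, clean)
    # Teaching loss: the student mimics the (detached) rain-free features.
    loss_teach = F.mse_loss(student_feat, teacher_feat.detach())

    loss = w_derain * loss_derain + w_teach * loss_teach
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    derain_net, det_net = DerainSubnet(), DroneDetBackbone()
    params = list(derain_net.parameters()) + list(det_net.parameters())
    optimizer = torch.optim.Adam(params, lr=1e-4)
    rainy = torch.rand(2, 3, 64, 64)     # synthetic rainy crops
    clean = torch.rand(2, 3, 64, 64)     # paired rain-free crops
    print(collaborative_step(derain_net, det_net, rainy, clean, optimizer))
```

In this sketch the teacher features are detached so the teaching loss only updates the detection branch; whether the paper propagates gradients through both subnetworks, and how it balances the losses, is not specified in the abstract.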