Swin-Transformer-Based YOLOv5 for Small-Object Detection in Remote Sensing Images

Bibliographic Details
Main Authors: Xuan Cao, Yanwei Zhang, Song Lang, Yan Gong
Format: Article
Language: English
Published: MDPI AG, 2023-03-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/23/7/3634
Description
Summary: This study aimed to address the problems of low detection accuracy and inaccurate localization in small-object detection for remote sensing images. An improved architecture based on the Swin Transformer and YOLOv5 is proposed. First, Complete-IOU (CIOU) was introduced to improve the K-means clustering algorithm, and anchor boxes of appropriate sizes for the dataset were then generated. Second, a modified CSPDarknet53 structure combined with the Swin Transformer was proposed to retain sufficient global context information and to extract more discriminative features through multi-head self-attention. For the path-aggregation neck, a simple and efficient weighted bidirectional feature pyramid network was proposed for effective cross-scale feature fusion. In addition, an extra prediction head and new feature-fusion layers were added for small objects. Finally, Coordinate Attention (CA) was introduced into the YOLOv5 network to better capture small-object features in remote sensing images. The effectiveness of the proposed method was demonstrated through several experiments on DOTA (Dataset for Object Detection in Aerial Images). The mean average precision (mAP) on the DOTA dataset reached 74.7%; compared with YOLOv5, the proposed method improved mAP by 8.9%, achieving higher accuracy in small-object detection for remote sensing images.
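
The record does not spell out the CIOU measure used to improve the K-means anchor clustering. For reference, the standard Complete-IoU formulation (which the abstract presumably uses, e.g. by replacing the usual 1 - IoU clustering distance with 1 - CIOU) is:

```latex
% Standard Complete-IoU (CIoU); b and b^{gt} are the box centers,
% \rho is the Euclidean distance, c is the diagonal of the smallest
% enclosing box, and v penalizes aspect-ratio mismatch.
\mathrm{CIOU} = \mathrm{IoU} - \frac{\rho^{2}(b,\, b^{gt})}{c^{2}} - \alpha v,
\qquad
v = \frac{4}{\pi^{2}}\left(\arctan\frac{w^{gt}}{h^{gt}} - \arctan\frac{w}{h}\right)^{2},
\qquad
\alpha = \frac{v}{(1 - \mathrm{IoU}) + v}
```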
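The abstract also gives no implementation detail for the weighted bidirectional feature pyramid network. The sketch below shows the fast normalized fusion commonly used in BiFPN-style necks; it is a minimal PyTorch illustration, and the module name, weight initialization, and epsilon value are assumptions rather than details from the paper.

```python
import torch
import torch.nn as nn


class WeightedFusion(nn.Module):
    """Fast normalized fusion of same-shaped feature maps, BiFPN-style:
    out = sum(w_i * x_i) / (sum(w_i) + eps), with learnable non-negative w_i."""

    def __init__(self, num_inputs: int, eps: float = 1e-4):
        super().__init__()
        # One learnable weight per input feature map (hypothetical init of 1.0).
        self.weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, inputs):
        # ReLU keeps the weights non-negative so the fusion is a normalized blend.
        w = torch.relu(self.weights)
        w = w / (w.sum() + self.eps)
        return sum(wi * x for wi, x in zip(w, inputs))


# Example: fuse two 256-channel feature maps of the same spatial size.
if __name__ == "__main__":
    fuse = WeightedFusion(num_inputs=2)
    p4_td = torch.randn(1, 256, 40, 40)
    p4_in = torch.randn(1, 256, 40, 40)
    print(fuse([p4_td, p4_in]).shape)  # torch.Size([1, 256, 40, 40])
```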
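Likewise, Coordinate Attention is only named in the abstract. Below is a minimal PyTorch sketch of the standard CA design (average pooling along height and width, a shared 1x1 bottleneck, and two direction-aware attention maps); the reduction ratio, activation choice, and layer names are assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    """Minimal Coordinate Attention block: encodes position by pooling along
    H and W separately, then reweights the input with two directional maps."""

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (N, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (N, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.shape
        x_h = self.pool_h(x)                      # (N, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)  # (N, C, W, 1)
        # Shared bottleneck over the concatenated directional descriptors.
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (N, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (N, C, 1, W)
        return x * a_h * a_w


# Example: apply CA to a 256-channel feature map.
if __name__ == "__main__":
    ca = CoordinateAttention(256)
    print(ca(torch.randn(1, 256, 40, 40)).shape)  # torch.Size([1, 256, 40, 40])
```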
ISSN:1424-8220