An Improved Swin Transformer-Based Model for Remote Sensing Object Detection and Instance Segmentation

Bibliographic Details
Main Authors: Xiangkai Xu, Zhejun Feng, Changqing Cao, Mengyuan Li, Jin Wu, Zengyan Wu, Yajie Shang, Shubing Ye
Format: Article
Language: English
Published: MDPI AG, 2021-11-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/13/23/4779
Description
Summary: Remote sensing image object detection and instance segmentation are actively studied research fields. Convolutional neural networks (CNNs) have shown shortcomings in object detection for remote sensing images. In recent years, the number of studies on transformer-based models has increased, and these studies have achieved good results. However, transformers still suffer from poor small-object detection and unsatisfactory edge-detail segmentation. To address these problems, we improved the Swin transformer by drawing on the complementary strengths of transformers and CNNs, and designed a local perception Swin transformer (LPSW) backbone that enhances the local perception of the network and improves detection accuracy for small-scale objects. We also designed a spatial attention interleaved execution cascade (SAIEC) network framework, which strengthens the segmentation accuracy of the network. Because remote sensing mask datasets are scarce, we created the MRS-1800 remote sensing mask dataset. Finally, we combined the proposed backbone with the new network framework and conducted experiments on the MRS-1800 dataset. Compared with the Swin transformer, the proposed model improved mask AP by 1.7%, mask AP_S by 3.6%, AP by 1.1%, and AP_S by 4.6%, demonstrating its effectiveness and feasibility.
ISSN:2072-4292
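
Note: the summary above describes the LPSW backbone only at a high level, so the following is a minimal, hypothetical sketch of one common way to graft CNN-style local perception onto a transformer block: a residual depthwise-convolution unit placed before attention. The class names, kernel size, and the use of plain multi-head attention in place of Swin's shifted-window attention are all illustrative assumptions, not the authors' published design.

# Hypothetical sketch of a locally enhanced transformer block, in the
# spirit of the LPSW described in the summary. All design details here
# are assumptions for illustration, not the paper's actual architecture.
import torch
import torch.nn as nn

class LocalPerceptionUnit(nn.Module):
    """Residual depthwise convolution: adds CNN-style locality to tokens."""
    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.dwconv = nn.Conv2d(
            dim, dim, kernel_size,
            padding=kernel_size // 2, groups=dim,  # depthwise: one filter per channel
        )

    def forward(self, x: torch.Tensor, h: int, w: int) -> torch.Tensor:
        # x: (B, H*W, C) token sequence -> (B, C, H, W) feature map
        b, n, c = x.shape
        feat = x.transpose(1, 2).reshape(b, c, h, w)
        feat = feat + self.dwconv(feat)          # residual local perception
        return feat.reshape(b, c, n).transpose(1, 2)

class LocallyEnhancedBlock(nn.Module):
    """Transformer block with a local perception unit before attention.
    Plain multi-head attention stands in for Swin's shifted-window attention."""
    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        self.lpu = LocalPerceptionUnit(dim)
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x: torch.Tensor, h: int, w: int) -> torch.Tensor:
        x = self.lpu(x, h, w)                    # inject locality before attention
        y = self.norm1(x)
        x = x + self.attn(y, y, y, need_weights=False)[0]
        x = x + self.mlp(self.norm2(x))
        return x

if __name__ == "__main__":
    block = LocallyEnhancedBlock(dim=96, num_heads=3)
    tokens = torch.randn(2, 56 * 56, 96)         # B=2, 56x56 patches, C=96
    print(block(tokens, 56, 56).shape)           # torch.Size([2, 3136, 96])

The depthwise branch is cheap (one filter per channel) and preserves the token layout, which is why hybrid designs often use it to restore the local inductive bias that pure attention lacks on small objects.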