Deep object segmentation and classification networks for building damage detection using the xBD dataset


Bibliographic Details
Main Authors: Zongze Zhao, Fenglei Wang, Shiyu Chen, Hongtao Wang, Gang Cheng
Format: Article
Language: English
Published: Taylor & Francis Group 2024-12-01
Series: International Journal of Digital Earth
Online Access: https://www.tandfonline.com/doi/10.1080/17538947.2024.2302577
Description
Summary: Deep learning has been extensively utilized in the assessment of building damage after disasters. However, building damage segmentation still faces challenges such as misjudged regions, high network complexity, and long running times. Hence, this paper proposes a two-stage building damage assessment network called the Efficient Channel Attention and Depthwise Separable Convolutional Neural Network (ECADS-CNN), which aims to quickly detect the type of disaster damage sustained by buildings. A deep object segmentation network and a deep damage classification network are integrated into a unified building damage detection network. In this study, an efficient channel attention (ECA) module is used to enhance the performance of building semantic segmentation, and a depthwise separable (DS) convolution module is added to the dimension-upscaling process. Finally, images from an untrained disaster dataset are used to test the robustness of the proposed model by comparing the evaluation results for each disaster. The experiments test a total of five common deep learning models, and the results indicate that the ECADS-CNN model improves speed by 7.4% and the overall F1 score by 5.2% compared with the baseline model. Its overall performance surpasses that of mainstream deep learning models.
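The abstract attributes the model's reduced complexity partly to depthwise separable convolution. The saving is easy to see from parameter counts: a standard convolution couples every input channel to every output channel through a full k x k kernel, while a depthwise separable convolution splits this into a per-channel spatial filter followed by a 1x1 pointwise mix. The sketch below (an illustration of the general technique, not code from the paper) compares the two counts for a typical layer:

```python
def conv_params(k: int, c_in: int, c_out: int) -> int:
    """Parameters in a standard k x k convolution (biases ignored):
    one k x k kernel for every (input channel, output channel) pair."""
    return k * k * c_in * c_out

def ds_conv_params(k: int, c_in: int, c_out: int) -> int:
    """Parameters in a depthwise separable convolution (biases ignored):
    depthwise stage: one k x k kernel per input channel,
    pointwise stage: a 1x1 convolution mixing c_in channels into c_out."""
    return k * k * c_in + c_in * c_out

# Example: a 3x3 layer with 256 input and 256 output channels.
std = conv_params(3, 256, 256)     # 589,824 parameters
ds = ds_conv_params(3, 256, 256)   # 67,840 parameters (~8.7x fewer)
```

For the hypothetical 3x3, 256-to-256 layer shown, the separable form needs roughly an order of magnitude fewer parameters, which is the kind of reduction that motivates its use in networks aiming for shorter running times.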
ISSN: 1753-8947; 1753-8955