A Spatial-Temporal Attention-Based Method and a New Dataset for Remote Sensing Image Change Detection
Remote sensing image change detection (CD) aims to identify significant changes of interest between bitemporal images. Given two co-registered images taken at different times, illumination variations and misregistration errors can overwhelm the real object changes. Exploiting the relationships among different spatial–temporal pixels may improve the performance of CD methods. We propose a novel Siamese-based spatial–temporal attention neural network. In contrast to previous methods that encode the bitemporal images separately without exploiting any spatial–temporal dependency, we design a CD self-attention mechanism to model the spatial–temporal relationships. We integrate a new CD self-attention module into the feature extraction procedure. The module calculates attention weights between any two pixels at different times and positions and uses them to generate more discriminative features. Because objects may appear at different scales, we partition the image into multi-scale subregions and apply self-attention within each subregion. In this way, we capture spatial–temporal dependencies at various scales and generate better representations for objects of various sizes. We also introduce a CD dataset, LEVIR-CD, which is two orders of magnitude larger than other public datasets in this field. LEVIR-CD consists of a large set of bitemporal Google Earth images, with 637 image pairs (1024 × 1024 pixels) and over 31,000 independently labeled change instances. Our proposed attention module improves the F1-score of our baseline model from 83.9 to 87.3 with acceptable computational overhead. Experimental results on a public remote sensing image CD dataset show that our method outperforms several state-of-the-art methods.
Main Authors: | Hao Chen, Zhenwei Shi (Image Processing Center, School of Astronautics, Beihang University, Beijing 100191, China) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-05-01 |
ISSN: | 2072-4292 |
DOI: | 10.3390/rs12101662 |
Series: | Remote Sensing |
Subjects: | image change detection; attention mechanism; multi-scale; spatial–temporal dependency; image change detection dataset; fully convolutional networks (FCN) |
Online Access: | https://www.mdpi.com/2072-4292/12/10/1662 |
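As a rough illustration of the attention mechanism described in the abstract, the sketch below shows one plausible way to compute self-attention jointly over the pixels of both temporal feature maps in PyTorch. It is not the authors' released implementation: the module name, the width-wise stacking of the two feature maps, and the scaled dot-product weighting are assumptions made for illustration, and the multi-scale subregion partitioning described in the abstract is omitted.

```python
import torch
import torch.nn as nn


class SpatialTemporalSelfAttention(nn.Module):
    """Illustrative sketch (not the paper's code) of self-attention over
    stacked bitemporal features, so that attention weights are computed
    between any two pixels at different times and positions."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, feat_t1, feat_t2):
        # Stack the two temporal feature maps side by side: (B, C, H, 2W).
        x = torch.cat([feat_t1, feat_t2], dim=-1)
        b, c, h, w = x.shape

        q = self.query(x).flatten(2).transpose(1, 2)   # (B, N, C'), N = H*2W
        k = self.key(x).flatten(2)                     # (B, C', N)
        v = self.value(x).flatten(2)                   # (B, C, N)

        # Softmax-normalized similarity between every pair of positions.
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)  # (B, N, N)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        out = self.gamma * out + x                     # residual connection

        # Split back into the two updated temporal feature maps.
        return out[..., : w // 2], out[..., w // 2 :]
```

In a Siamese CD pipeline, feat_t1 and feat_t2 would be the shared-encoder features of the two co-registered images; the attention-updated features are then compared (for example, via a distance metric) to produce the change map.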