A Dual-Branch Deep Learning Architecture for Multisensor and Multitemporal Remote Sensing Semantic Segmentation


Bibliographic Details
Main Authors: Luca Bergamasco, Francesca Bovolo, Lorenzo Bruzzone
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Online Access: https://ieeexplore.ieee.org/document/10041791/
Description
Summary: Multisensor data analysis allows exploiting the heterogeneous data regularly acquired by the many available remote sensing (RS) systems. Machine- and deep-learning methods use the information from heterogeneous sources to improve the results obtained with single-source data. However, state-of-the-art methods analyze either the multiscale information of multisensor multiresolution images or the temporal component of image time series. We propose a supervised deep-learning classification method that jointly performs a multiscale and multitemporal analysis of RS multitemporal images acquired by different sensors. The proposed method processes very-high-resolution (VHR) images with a residual network that has a wide receptive field and handles geometrical details, and multitemporal high-resolution (HR) images with a 3-D convolutional neural network that analyzes both the spatial and temporal information. The multiscale and multitemporal features are then processed together in a decoder to retrieve a land-cover map. We tested the proposed method on two multisensor and multitemporal datasets: one composed of VHR orthophotos and Sentinel-2 multitemporal images for pasture classification, and the other composed of VHR orthophotos and Sentinel-1 multitemporal images. The results prove the effectiveness of the proposed classification method.
ISSN: 2151-1535
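
The abstract describes a dual-branch design: a 2-D residual branch with a wide receptive field for the single-date VHR orthophoto, a 3-D convolutional branch for the HR image time series, and a decoder that fuses the two feature streams into a land-cover map. The sketch below is a minimal PyTorch illustration of that general idea only; the layer counts, channel widths, dilation rates, temporal pooling, and fusion strategy are assumptions and are not the authors' exact architecture (see the IEEE Xplore document linked above for the actual network).

# Minimal dual-branch segmentation sketch, inferred from the abstract only.
# All hyperparameters (channels, dilations, scale ratio) are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """2-D residual block; dilation widens the receptive field (assumption)."""
    def __init__(self, channels, dilation=1):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)


class DualBranchSegmenter(nn.Module):
    """VHR branch (2-D residual net) + multitemporal HR branch (3-D CNN) + decoder."""
    def __init__(self, vhr_bands=3, hr_bands=10, num_classes=5):
        super().__init__()
        # VHR branch: residual network with increasing dilation for a wide receptive field.
        self.vhr_stem = nn.Sequential(nn.Conv2d(vhr_bands, 64, 3, padding=1), nn.ReLU())
        self.vhr_blocks = nn.Sequential(
            ResidualBlock(64, dilation=1),
            ResidualBlock(64, dilation=2),
            ResidualBlock(64, dilation=4),
        )
        # HR multitemporal branch: 3-D convolutions over (time, height, width).
        self.hr_branch = nn.Sequential(
            nn.Conv3d(hr_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((1, None, None)),  # collapse the temporal axis
        )
        # Decoder: fuse multiscale and multitemporal features into per-pixel class scores.
        self.decoder = nn.Sequential(
            nn.Conv2d(64 + 64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, num_classes, 1),
        )

    def forward(self, vhr, hr_series):
        # vhr:       (B, vhr_bands, H_vhr, W_vhr)  single-date VHR orthophoto patch
        # hr_series: (B, hr_bands, T, H_hr, W_hr)  HR image time series patch
        f_vhr = self.vhr_blocks(self.vhr_stem(vhr))
        f_hr = self.hr_branch(hr_series).squeeze(2)            # (B, 64, H_hr, W_hr)
        f_hr = F.interpolate(f_hr, size=f_vhr.shape[-2:],      # upsample to the VHR grid
                             mode="bilinear", align_corners=False)
        return self.decoder(torch.cat([f_vhr, f_hr], dim=1))   # (B, num_classes, H_vhr, W_vhr)


if __name__ == "__main__":
    model = DualBranchSegmenter()
    vhr = torch.randn(1, 3, 128, 128)      # e.g. an orthophoto patch
    hr = torch.randn(1, 10, 12, 32, 32)    # e.g. a 12-date Sentinel-2 patch
    print(model(vhr, hr).shape)            # torch.Size([1, 5, 128, 128])

In this sketch the temporal dimension is collapsed by average pooling and the HR features are bilinearly upsampled to the VHR grid before fusion; the paper may use a different temporal aggregation and a learned upsampling in its decoder.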