Effect of Bit Depth on Cloud Segmentation of Remote-Sensing Images

Cloud cover in remote-sensing images attenuates or even obscures ground-object information and alters the texture and spectral characteristics of the image, so accurately detecting clouds is of great significance to the remote-sensing field. Cloud detection uses semantic segmentation to classify remote-sensing images at the pixel level. However, previous studies have focused on improving algorithm performance, and little attention has been paid to the impact of the bit depth of remote-sensing images on cloud detection. In this paper, the deep semantic segmentation network UNet is taken as an example, and the widely used cloud-labeling dataset "L8 Biome" is used as verification data to explore the relationship between bit depth and segmentation accuracy across different surface landscapes. The results show that when the images are normalized, cloud detection on 16-bit remote-sensing images is slightly better than on 8-bit images; without normalization, the gap widens. However, training on 16-bit images takes longer. This means that data selection for cloud detection need not always follow the highest available bit depth but should balance efficiency and accuracy.

Bibliographic Details
Main Authors: Lingcen Liao, Wei Liu, Shibin Liu
Format: Article
Language: English
Published: MDPI AG, 2023-05-01
Series: Remote Sensing
Subjects: bit depth; remote sensing; semantic segmentation; cloud; deep learning
Online Access: https://www.mdpi.com/2072-4292/15/10/2548
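The abstract's key point about normalization can be sketched in Python. This is an illustrative example, not code from the paper: it assumes the common practice of scaling integer imagery into [0, 1] by the dtype's maximum value, which places 8-bit and 16-bit inputs on the same numeric range before they reach a network such as UNet, leaving only the quantization error of the 8-bit version.

```python
import numpy as np

# Hypothetical Landsat-like band patch: the same scene content
# stored at two different bit depths.
rng = np.random.default_rng(0)
img16 = rng.integers(0, 65536, size=(64, 64), dtype=np.uint16)  # 16-bit original
img8 = (img16 // 256).astype(np.uint8)                          # 8-bit requantization

def normalize(img: np.ndarray) -> np.ndarray:
    """Scale an integer image to float32 in [0, 1] using its dtype's full range."""
    max_val = np.iinfo(img.dtype).max
    return img.astype(np.float32) / max_val

x16 = normalize(img16)
x8 = normalize(img8)

# After normalization both inputs occupy [0, 1]; the remaining gap is
# only the 8-bit quantization error, at most 255/65535 per pixel.
print(float(np.abs(x16 - x8).max()))
```

Without this step, raw 8-bit values (0-255) and raw 16-bit values (0-65535) differ by two orders of magnitude in scale, which is consistent with the abstract's finding that the accuracy gap between bit depths widens when images are not normalized.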
Collection: DOAJ (Directory of Open Access Journals)
ISSN: 2072-4292
DOI: 10.3390/rs15102548
Citation: Remote Sensing, vol. 15, no. 10, art. 2548 (2023)
Author affiliations: Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China (all three authors)