Robotic Mapping Approach under Illumination-Variant Environments at Planetary Construction Sites

In planetary construction, semiautonomously teleoperated robots are expected to perform complex tasks for site preparation and infrastructure emplacement. A highly detailed 3D map is essential for construction planning and management. However, the planetary surface imposes restrictions on mapping because of its rugged and homogeneous terrain. Additionally, changes in illumination cause the mapping result (a 3D point-cloud map) to have inconsistent color properties, which hampers understanding of the topographic properties of a worksite. Therefore, this paper proposes a robotic construction mapping approach that is robust to illumination-variant environments. The proposed approach leverages a deep learning-based low-light image enhancement (LLIE) method to improve the mapping capability of a visual simultaneous localization and mapping (SLAM)-based robotic mapping system. In the experiment, the robotic mapping system collected terrain images at an emulated planetary worksite during the daytime, from noon to late afternoon. Two sets of point-cloud maps, created from the original and the enhanced terrain images, were examined for comparison. The results showed that the LLIE method significantly enhanced image brightness while preserving the inherent colors of the original terrain images, which in turn increased the visibility and the overall accuracy of the point-cloud map.
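
The abstract describes a two-stage pipeline: terrain images are first brightened with a deep learning-based LLIE network, and the enhanced frames are then fed to a visual SLAM system that builds the 3D point-cloud map. The sketch below illustrates only that preprocessing stage and is not the authors' implementation: a classical CLAHE adjustment stands in for the paper's learned LLIE model, and the function name enhance_low_light and the file names are hypothetical.

```python
# Minimal sketch (not the authors' code): brighten a low-light terrain frame
# before handing it to a visual-SLAM front end. The paper uses a learned LLIE
# network; CLAHE is used here only as a self-contained, runnable stand-in.
import cv2
import numpy as np


def enhance_low_light(bgr: np.ndarray, clip_limit: float = 3.0) -> np.ndarray:
    """Brighten a dark terrain image while preserving its inherent colors.

    Operates on the lightness channel of the LAB color space only, so the
    chrominance (and hence the terrain's color properties) is left untouched.
    """
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=(8, 8))
    l_enhanced = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_enhanced, a, b)), cv2.COLOR_LAB2BGR)


if __name__ == "__main__":
    # Hypothetical file names; in practice the frames come from the rover camera.
    frame = cv2.imread("terrain_frame.png")
    if frame is not None:
        enhanced = enhance_low_light(frame)
        # The enhanced frame, rather than the raw one, would be passed to the
        # visual-SLAM pipeline that assembles the 3D point-cloud map.
        cv2.imwrite("terrain_frame_enhanced.png", enhanced)
```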

Bibliographic Details
Main Authors: Sungchul Hong, Pranjay Shyam, Antyanta Bangunharcana, Hyuseoung Shin
Format: Article
Language: English
Published: MDPI AG, 2022-02-01
Series: Remote Sensing, Vol. 14, No. 4, Article 1027
ISSN: 2072-4292
DOI: 10.3390/rs14041027
Subjects: planetary construction; robotic mapping; SLAM; low-light enhancement; 3D point-cloud map; deep learning
Online Access: https://www.mdpi.com/2072-4292/14/4/1027

Author Affiliations
Sungchul Hong: Department of Geoinformatic Engineering, Inha University, Incheon 22212, Korea
Pranjay Shyam: Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, College of Engineering, Daejeon 34141, Korea
Antyanta Bangunharcana: Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, College of Engineering, Daejeon 34141, Korea
Hyuseoung Shin: Department of Future Technology and Convergence Research, Korea Institute of Civil Engineering and Building Technology, Goyang-si 10223, Korea