A CAMERA-LIDAR CALIBRATION METHOD ASSISTED BY INDOOR SPATIAL STRUCTURE

Bibliographic Details
Main Authors: C. Ye, Z. Kang, X. Guo
Format: Article
Language: English
Published: Copernicus Publications 2023-12-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: https://isprs-archives.copernicus.org/articles/XLVIII-1-W2-2023/693/2023/isprs-archives-XLVIII-1-W2-2023-693-2023.pdf
author C. Ye
Z. Kang
X. Guo
collection DOAJ
description Camera-LiDAR calibration is one of the foundations for building a multi-sensor fusion mapping system. Planar features of walls and floors in indoor environments provide effective constraints for multi-sensor calibration. In this paper, we propose a new camera-LiDAR calibration method constrained by indoor spatial structure. Using the image and point cloud data collected by the sensors, visual odometry and LiDAR odometry are constructed to calculate the transformation between the sensors. Building on the visual and LiDAR odometry, structural parameters of the indoor environment are extracted from the images and point clouds to constrain the rotation estimation between the sensors. In the proposed method, lines are extracted from the camera images and used to estimate and track vanishing points. The directions estimated from the vanishing points serve as a global constraint to optimize the camera's rotation parameter estimation. Planes fitted to the LiDAR point cloud are used to compute a set of orthogonal normal vectors corresponding to the ground and wall surfaces, which serve as global constraints to optimize the LiDAR's rotation parameter estimation. The proposed calibration method is targetless and constrained only by the indoor spatial structure. The results show that the proposed method can calibrate the LiDAR and camera without accumulating errors during the rotation estimation process.
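The rotation constraint described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): assuming the structural directions have already been recovered in each frame, ground/wall normals can be obtained by fitting planes to LiDAR points, and the camera-LiDAR rotation can be estimated by aligning those normals with the vanishing-point directions via a Kabsch/SVD fit. The helper names below (plane_normal, rotation_from_directions) are hypothetical.

```python
import numpy as np

def plane_normal(points):
    """Unit normal of a plane fitted to an (N, 3) patch of LiDAR points,
    taken as the right singular vector of the smallest singular value."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]

def rotation_from_directions(dirs_cam, dirs_lidar):
    """Estimate R such that dirs_cam ~ R @ dirs_lidar.

    dirs_cam   : (3, N) unit directions in the camera frame
                 (e.g. Manhattan-world axes from tracked vanishing points).
    dirs_lidar : (3, N) matching unit normals of ground/wall planes
                 fitted to the LiDAR point cloud.
    Solves the orthogonal Procrustes problem with the Kabsch/SVD method.
    """
    H = dirs_lidar @ dirs_cam.T                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

# Toy check: recover a known rotation from noiselessly rotated axes.
axes_lidar = np.eye(3)                               # ground + two wall normals
R_true, _ = np.linalg.qr(np.random.randn(3, 3))      # random orthogonal matrix
R_true *= np.sign(np.linalg.det(R_true))             # make it a proper rotation
axes_cam = R_true @ axes_lidar
R_est = rotation_from_directions(axes_cam, axes_lidar)
assert np.allclose(R_est, R_true, atol=1e-9)
```

The sketch shows only a single direction-alignment step; in the paper's pipeline the vanishing-point directions and plane normals act as global constraints within the visual and LiDAR odometry, which is how the cumulative rotation error is avoided.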
format Article
id doaj.art-d039392ae7ee451aa397dc013a24acc8
institution Directory Open Access Journal
issn 1682-1750
2194-9034
language English
publishDate 2023-12-01
publisher Copernicus Publications
series The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
doi 10.5194/isprs-archives-XLVIII-1-W2-2023-693-2023
volume XLVIII-1-W2-2023
pages 693-698
author_affiliations School of Land Science and Technology, China University of Geosciences, Xueyuan Road, Beijing, 100083, China
Research Center of Lunar and Planetary Remote Sensing Exploration, China University of Geosciences (Beijing), No. 29 Xueyuan Road, Haidian District, Beijing, China
Subcenter of International Cooperation and Research on Lunar and Planetary Exploration, Center of Space Exploration, Ministry of Education of The People's Republic of China, No. 29 Xueyuan Road, Haidian District, Beijing 100083, China
title A CAMERA-LIDAR CALIBRATION METHOD ASSISTED BY INDOOR SPATIAL STRUCTURE
url https://isprs-archives.copernicus.org/articles/XLVIII-1-W2-2023/693/2023/isprs-archives-XLVIII-1-W2-2023-693-2023.pdf