Extrinsic Calibration for LiDAR–Camera Systems Using Direct 3D–2D Correspondences

Recent advances in driverless cars, intelligent robots, and remote-sensing measurement have shown that LiDAR fused with cameras can provide more comprehensive and reliable sensing of the surroundings. However, because it is difficult to extract features from sparse LiDAR data to create 3D–2D correspondences, finding a method for accurate extrinsic calibration of all types of LiDAR with cameras has become a research hotspot. To solve this problem, this paper proposes a method that directly obtains the 3D–2D correspondences of LiDAR–camera systems to complete accurate calibration. In this method, a laser detector card is used as an auxiliary tool to directly obtain the correspondences between laser spots and image pixels, thus overcoming the difficulty of extracting features from sparse LiDAR data. In addition, a two-stage coarse-to-fine framework is designed that not only solves the perspective-n-point problem under observation errors but also requires only four LiDAR points and their corresponding pixels for accurate extrinsic calibration. Finally, extensive simulations and experiments show that our method outperforms existing methods in both effectiveness and accuracy.
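The calibration the abstract describes estimates the rigid transform (R, t) between the LiDAR and camera frames by minimizing the reprojection error of 3D–2D correspondences (a perspective-n-point problem). The sketch below is not the authors' implementation; it is a generic illustration of the underlying pinhole projection model and the reprojection-error metric such a solver minimizes. All intrinsics, extrinsics, and point coordinates are made-up illustrative values.

```python
import numpy as np

def project(points_lidar, K, R, t):
    """Project 3D LiDAR points into the image: x ~ K (R X + t)."""
    cam = (R @ points_lidar.T).T + t      # LiDAR frame -> camera frame
    pix = (K @ cam.T).T                   # apply camera intrinsics
    return pix[:, :2] / pix[:, 2:3]       # perspective divide -> pixels

# Synthetic intrinsics and ground-truth extrinsics (illustrative values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                             # identity rotation for the sketch
t = np.array([0.05, 0.0, 0.0])            # 5 cm lateral offset

# Four 3D points, matching the paper's minimal four-point configuration.
X = np.array([[0.0, 0.0, 2.0],
              [0.5, 0.0, 2.0],
              [0.0, 0.5, 2.5],
              [0.5, 0.5, 3.0]])

observed = project(X, K, R, t)            # stands in for detected laser-spot pixels
# Reprojection RMSE of a candidate calibration (here the true one, so it is 0).
candidate = project(X, K, R, t)
rmse = np.sqrt(np.mean(np.sum((observed - candidate) ** 2, axis=1)))
print(rmse)  # 0.0
```

In a real coarse-to-fine pipeline, a closed-form PnP solution would first give a coarse (R, t), and an iterative refinement would then minimize this RMSE over the observed correspondences.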

Detailed Specification

Bibliographic Description
Main authors: Hao Yi, Bo Liu, Bin Zhao, Enhai Liu
Format: Article
Language: English
Published: MDPI AG, 2022-11-01
Series: Remote Sensing
Subjects: LiDAR–camera system; extrinsic calibration; perspective-n-point
Online access: https://www.mdpi.com/2072-4292/14/23/6082
ISSN: 2072-4292
DOI: 10.3390/rs14236082
Author affiliations: Hao Yi, Bo Liu, Bin Zhao, and Enhai Liu — Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China