Batch differentiable pose refinement for in-the-wild camera/LiDAR extrinsic calibration
Accurate camera-to-LiDAR (Light Detection and Ranging) extrinsic calibration is important for robotic tasks that carry out tight sensor fusion, such as target tracking and odometry. Calibration is typically performed before deployment in controlled conditions using calibration targets; however, this limits scalability and subsequent recalibration. We propose a novel approach for target-free camera-LiDAR calibration using end-to-end direct alignment which does not need calibration targets. Our batched formulation enhances sample efficiency during training and robustness at inference time. We present experimental results, on publicly available real-world data, demonstrating 1.6 cm / 0.07° median accuracy when transferred to unseen sensors from held-out data sequences. We also show state-of-the-art zero-shot transfer to unseen cameras, LiDARs, and environments.
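The "end-to-end direct alignment" idea in the abstract can be pictured, very loosely, as optimizing an extrinsic estimate so that LiDAR points projected through the camera line up with image evidence, with the loss batched over several scans. The sketch below is *not* the paper's method (which is learned and operates on real sensor data); it is a minimal NumPy toy in which the image evidence is synthesized from a hypothetical ground-truth translation `t_true`, and central-difference gradients stand in for the autodiff a real pipeline would use. All names and values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the paper's pipeline): refine a camera-LiDAR
# extrinsic translation by direct alignment of projected LiDAR points,
# batched over several scans. All names/values here are hypothetical.

rng = np.random.default_rng(0)
t_true = np.array([0.10, -0.05, 0.02])   # hypothetical true extrinsic translation
# Batch of synthetic "LiDAR scans": points roughly 5 m in front of the camera.
scans = [rng.normal(size=(50, 3)) + 5.0 for _ in range(4)]

def project(pts):
    """Pinhole projection of 3-D points onto the normalized image plane."""
    return pts[:, :2] / pts[:, 2:3]

def loss(t, batch):
    """Batched direct-alignment loss: mean squared reprojection error between
    points transformed by the estimate t and the synthetic image evidence
    (here generated with t_true, standing in for real image observations)."""
    return float(np.mean([np.mean((project(p + t) - project(p + t_true)) ** 2)
                          for p in batch]))

def num_grad(t, batch, h=1e-6):
    """Central-difference gradient; a stand-in for automatic differentiation."""
    g = np.zeros(3)
    for i in range(3):
        e = np.zeros(3)
        e[i] = h
        g[i] = (loss(t + e, batch) - loss(t - e, batch)) / (2 * h)
    return g

t_est = np.zeros(3)                      # start from the nominal extrinsic
loss_init = loss(t_est, scans)
for _ in range(1500):                    # plain gradient descent on the batch loss
    t_est -= 2.0 * num_grad(t_est, scans)
loss_final = loss(t_est, scans)
print(np.round(t_est, 3))                # t_est approaches t_true
```

Averaging the alignment loss over a batch of scans, rather than refining against a single scan, is what gives the batched formulation its robustness: errors that are ambiguous in one view are constrained by the others.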
Main authors: | Fu, LFT ; Fallon, M |
---|---|
Format: | Conference item |
Language: | English |
Published: | Journal of Machine Learning Research, 2023 |
author | Fu, LFT Fallon, M |
collection | OXFORD |
description | Accurate camera-to-LiDAR (Light Detection and Ranging) extrinsic calibration is important for robotic tasks that carry out tight sensor fusion, such as target tracking and odometry. Calibration is typically performed before deployment in controlled conditions using calibration targets; however, this limits scalability and subsequent recalibration. We propose a novel approach for target-free camera-LiDAR calibration using end-to-end direct alignment which does not need calibration targets. Our batched formulation enhances sample efficiency during training and robustness at inference time. We present experimental results, on publicly available real-world data, demonstrating 1.6 cm / 0.07° median accuracy when transferred to unseen sensors from held-out data sequences. We also show state-of-the-art zero-shot transfer to unseen cameras, LiDARs, and environments. |
format | Conference item |
id | oxford-uuid:dfa9a795-a49c-4b70-88a9-b1f14de342a7 |
institution | University of Oxford |
language | English |
publishDate | 2023 |
publisher | Journal of Machine Learning Research |
title | Batch differentiable pose refinement for in-the-wild camera/LiDAR extrinsic calibration |