Multi-sensor calibration for autonomous container prime mover
Self-driving cars are becoming increasingly popular, and their deployment in docks is an emerging trend. To realize intelligent operation, a multi-sensor system is needed. In this dissertation, we propose a multi-sensor system mounted on a car with various cameras and LiDARs to simulate the operation of autonomous container prime movers at the dock. To gain a general understanding of the surrounding environment, sensor fusion plays a vital role, so the dissertation mainly focuses on calibrating the whole multi-sensor system. This includes multi-camera calibration and RGB camera-LiDAR calibration. Both target-based and targetless methods are used; we compare and analyze their strengths and the scenarios in which each is appropriate. The targetless method sometimes produces unstable results, so we alleviate this with multi-scene calibration. In some cases, the sensors do not share a common field of view, so we propose chaining the transformations through an intermediate sensor. Furthermore, calibration between a blind-spot LiDAR and a camera has rarely been addressed before, and we extend the generic target-based method to realize it. A qualitative analysis of the system's calibration results is performed, and the sensor fusion results show that the obtained calibration parameters are accurate. Finally, we compile a calibration tutorial and share our sample experimental dataset on GitHub for further research. The tutorial and dataset are available at https://github.com/ZyueRemi/Tutorial_Lidar_camera_calibration.
Main Author: | Zhai, Yue |
---|---|
Other Authors: | Xie Lihua |
Format: | Thesis-Master by Coursework |
Language: | English |
Published: | Nanyang Technological University, 2023 |
Subjects: | Engineering::Electrical and electronic engineering |
Online Access: | https://hdl.handle.net/10356/165016 |
_version_ | 1826111264156286976 |
---|---|
author | Zhai, Yue |
author2 | Xie Lihua |
author_facet | Xie Lihua Zhai, Yue |
author_sort | Zhai, Yue |
collection | NTU |
description | Self-driving cars are becoming increasingly popular, and their deployment in docks is an emerging trend. To realize intelligent operation, a multi-sensor system is needed. In this dissertation, we propose a multi-sensor system mounted on a car with various cameras and LiDARs to simulate the operation of autonomous container prime movers at the dock. To gain a general understanding of the surrounding environment, sensor fusion plays a vital role, so the dissertation mainly focuses on calibrating the whole multi-sensor system. This includes multi-camera calibration and RGB camera-LiDAR calibration. Both target-based and targetless methods are used; we compare and analyze their strengths and the scenarios in which each is appropriate. The targetless method sometimes produces unstable results, so we alleviate this with multi-scene calibration. In some cases, the sensors do not share a common field of view, so we propose chaining the transformations through an intermediate sensor. Furthermore, calibration between a blind-spot LiDAR and a camera has rarely been addressed before, and we extend the generic target-based method to realize it. A qualitative analysis of the system's calibration results is performed, and the sensor fusion results show that the obtained calibration parameters are accurate. Finally, we compile a calibration tutorial and share our sample experimental dataset on GitHub for further research. The tutorial and dataset are available at https://github.com/ZyueRemi/Tutorial_Lidar_camera_calibration. |
first_indexed | 2024-10-01T02:47:49Z |
format | Thesis-Master by Coursework |
id | ntu-10356/165016 |
institution | Nanyang Technological University |
language | English |
last_indexed | 2024-10-01T02:47:49Z |
publishDate | 2023 |
publisher | Nanyang Technological University |
record_format | dspace |
spelling | ntu-10356/1650162023-07-04T16:15:16Z Multi-sensor calibration for autonomous container prime mover Zhai, Yue Xie Lihua School of Electrical and Electronic Engineering ELHXIE@ntu.edu.sg Engineering::Electrical and electronic engineering Self-driving cars are becoming increasingly popular, and their deployment in docks is an emerging trend. To realize intelligent operation, a multi-sensor system is needed. In this dissertation, we propose a multi-sensor system mounted on a car with various cameras and LiDARs to simulate the operation of autonomous container prime movers at the dock. To gain a general understanding of the surrounding environment, sensor fusion plays a vital role, so the dissertation mainly focuses on calibrating the whole multi-sensor system. This includes multi-camera calibration and RGB camera-LiDAR calibration. Both target-based and targetless methods are used; we compare and analyze their strengths and the scenarios in which each is appropriate. The targetless method sometimes produces unstable results, so we alleviate this with multi-scene calibration. In some cases, the sensors do not share a common field of view, so we propose chaining the transformations through an intermediate sensor. Furthermore, calibration between a blind-spot LiDAR and a camera has rarely been addressed before, and we extend the generic target-based method to realize it. A qualitative analysis of the system's calibration results is performed, and the sensor fusion results show that the obtained calibration parameters are accurate. Finally, we compile a calibration tutorial and share our sample experimental dataset on GitHub for further research. The tutorial and dataset are available at https://github.com/ZyueRemi/Tutorial_Lidar_camera_calibration. Master of Science (Computer Control and Automation) 2023-03-08T00:38:59Z 2023-03-08T00:38:59Z 2023 Thesis-Master by Coursework Zhai, Y. (2023). Multi-sensor calibration for autonomous container prime mover.
Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/165016 https://hdl.handle.net/10356/165016 en application/pdf Nanyang Technological University |
spellingShingle | Engineering::Electrical and electronic engineering Zhai, Yue Multi-sensor calibration for autonomous container prime mover |
title | Multi-sensor calibration for autonomous container prime mover |
title_full | Multi-sensor calibration for autonomous container prime mover |
title_fullStr | Multi-sensor calibration for autonomous container prime mover |
title_full_unstemmed | Multi-sensor calibration for autonomous container prime mover |
title_short | Multi-sensor calibration for autonomous container prime mover |
title_sort | multi sensor calibration for autonomous container prime mover |
topic | Engineering::Electrical and electronic engineering |
url | https://hdl.handle.net/10356/165016 |
work_keys_str_mv | AT zhaiyue multisensorcalibrationforautonomouscontainerprimemover |
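The abstract's idea of chaining transformations through an intermediate sensor, for sensor pairs without a common field of view, can be sketched with homogeneous 4x4 extrinsic matrices. This is a minimal illustration, not the thesis's implementation: the frame names (`T_cam_mid`, `T_mid_lidar`) and the identity-rotation extrinsics are hypothetical placeholders standing in for calibrated values.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibrated extrinsics (identity rotations for readability):
# T_cam_mid  maps points from the intermediate-sensor frame to the camera frame.
# T_mid_lidar maps points from the LiDAR frame to the intermediate-sensor frame.
T_cam_mid = make_transform(np.eye(3), np.array([0.1, 0.0, 0.0]))
T_mid_lidar = make_transform(np.eye(3), np.array([0.0, 0.2, 0.0]))

# Chained extrinsic: LiDAR -> camera, obtained without any direct
# camera-LiDAR overlap, by composing the two calibrated transforms.
T_cam_lidar = T_cam_mid @ T_mid_lidar

# Map one homogeneous LiDAR point into the camera frame.
p_lidar = np.array([1.0, 2.0, 3.0, 1.0])
p_cam = T_cam_lidar @ p_lidar
```

With identity rotations the chained translation is simply the sum of the two offsets; with real calibrated rotations the same matrix product applies unchanged, which is why composing extrinsics through an intermediate frame works.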