LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios
LiDAR-based simultaneous localization and mapping (SLAM) and online localization methods are widely used in autonomous driving and are key parts of intelligent vehicles. However, current SLAM algorithms suffer from map drift, and localization algorithms based on a single sensor adapt poorly to complex scenarios.
Main Authors: | Kai Dai, Bohua Sun, Guanpu Wu, Shuai Zhao, Fangwu Ma, Yufei Zhang, Jian Wu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-02-01 |
Series: | Journal of Imaging |
Subjects: | LiDAR SLAM; autonomous vehicle; localization; multi-sensor fusion |
Online Access: | https://www.mdpi.com/2313-433X/9/2/52 |
_version_ | 1797620159935938560 |
---|---|
author | Kai Dai; Bohua Sun; Guanpu Wu; Shuai Zhao; Fangwu Ma; Yufei Zhang; Jian Wu |
author_facet | Kai Dai; Bohua Sun; Guanpu Wu; Shuai Zhao; Fangwu Ma; Yufei Zhang; Jian Wu |
author_sort | Kai Dai |
collection | DOAJ |
description | LiDAR-based simultaneous localization and mapping (SLAM) and online localization methods are widely used in autonomous driving and are key parts of intelligent vehicles. However, current SLAM algorithms suffer from map drift, and localization algorithms based on a single sensor adapt poorly to complex scenarios. This paper proposes a SLAM and online localization method based on multi-sensor fusion, integrated into a general framework. In the mapping process, constraints consisting of normal distributions transform (NDT) registration, loop closure detection, and real-time kinematic (RTK) global navigation satellite system (GNSS) positions are applied in the front-end, and a pose graph optimization algorithm is applied in the back-end, to produce an optimized map without drift. In the localization process, an error-state Kalman filter (ESKF) fuses the LiDAR-based localization position with vehicle states to achieve more robust and precise localization (see the fusion sketch following this record). The proposed method is evaluated on the open-source KITTI dataset and in field tests. The results demonstrate its effectiveness: 5–10 cm mapping accuracy, 20–30 cm localization accuracy, and online autonomous driving in complex scenarios. |
first_indexed | 2024-03-11T08:36:53Z |
format | Article |
id | doaj.art-dc769e48b1684d9db8c07f7a2c6c7e10 |
institution | Directory Open Access Journal |
issn | 2313-433X |
language | English |
last_indexed | 2024-03-11T08:36:53Z |
publishDate | 2023-02-01 |
publisher | MDPI AG |
record_format | Article |
series | Journal of Imaging |
spelling | Record: doaj.art-dc769e48b1684d9db8c07f7a2c6c7e10 (updated 2023-11-16T21:25:21Z); Language: English; Publisher: MDPI AG; Series: Journal of Imaging; ISSN: 2313-433X; Published: 2023-02-01; Volume 9, Issue 2, Article 52; DOI: 10.3390/jimaging9020052; Title: LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios; Authors: Kai Dai, Bohua Sun, Guanpu Wu (State Key Laboratory of Automotive Simulation and Control, Jilin University, Changchun 130025, China), Shuai Zhao (Automotive Data Center, CATARC, Tianjin 300000, China), Fangwu Ma, Yufei Zhang, Jian Wu (State Key Laboratory of Automotive Simulation and Control, Jilin University, Changchun 130025, China); URL: https://www.mdpi.com/2313-433X/9/2/52; Keywords: LiDAR SLAM, autonomous vehicle, localization, multi-sensor fusion |
spellingShingle | Kai Dai; Bohua Sun; Guanpu Wu; Shuai Zhao; Fangwu Ma; Yufei Zhang; Jian Wu; LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios; Journal of Imaging; LiDAR SLAM; autonomous vehicle; localization; multi-sensor fusion |
title | LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios |
title_full | LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios |
title_fullStr | LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios |
title_full_unstemmed | LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios |
title_short | LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios |
title_sort | lidar based sensor fusion slam and localization for autonomous driving vehicles in complex scenarios |
topic | LiDAR SLAM; autonomous vehicle; localization; multi-sensor fusion |
url | https://www.mdpi.com/2313-433X/9/2/52 |
work_keys_str_mv | AT kaidai lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT bohuasun lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT guanpuwu lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT shuaizhao lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT fangwuma lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT yufeizhang lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios AT jianwu lidarbasedsensorfusionslamandlocalizationforautonomousdrivingvehiclesincomplexscenarios |
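The description above states that an error-state Kalman filter (ESKF) fuses the LiDAR-based localization position with vehicle states. The paper's implementation is not reproduced here; the following is a minimal, hypothetical planar (x, y, yaw) sketch in Python/NumPy that illustrates the general pattern under simplified assumptions: the nominal pose is propagated with vehicle motion inputs (speed and yaw rate), the error-state covariance is propagated alongside it, and a LiDAR map-matching pose is used to estimate and inject the error. All class, function, and variable names are illustrative and not taken from the paper or any specific library.

```python
import numpy as np


def wrap_angle(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi


class PlanarESKF:
    """Minimal 2-D error-state Kalman filter sketch.

    Nominal state: [x, y, yaw]. The error state (dx, dy, dyaw) is kept
    implicitly at zero mean; only its covariance P is propagated.
    """

    def __init__(self, x0, P0, Q, R):
        self.x = np.asarray(x0, dtype=float)   # nominal state [x, y, yaw]
        self.P = np.asarray(P0, dtype=float)   # error-state covariance (3x3)
        self.Q = np.asarray(Q, dtype=float)    # process noise (3x3)
        self.R = np.asarray(R, dtype=float)    # LiDAR pose measurement noise (3x3)

    def predict(self, v, yaw_rate, dt):
        """Propagate the nominal state with vehicle-state inputs (speed, yaw rate)."""
        x, y, yaw = self.x
        self.x = np.array([x + v * np.cos(yaw) * dt,
                           y + v * np.sin(yaw) * dt,
                           wrap_angle(yaw + yaw_rate * dt)])
        # Jacobian of the error-state dynamics w.r.t. the error state.
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],
                      [0.0, 1.0,  v * np.cos(yaw) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_lidar_pose(self, z):
        """Correct with a LiDAR map-matching pose z = [x, y, yaw]."""
        H = np.eye(3)                           # the pose is observed directly
        r = np.asarray(z, dtype=float) - self.x
        r[2] = wrap_angle(r[2])                 # keep the yaw residual on the manifold
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ r                              # estimated error state
        self.x = self.x + dx                    # inject the error into the nominal state
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P   # covariance update (error mean reset to zero)


if __name__ == "__main__":
    eskf = PlanarESKF(x0=[0.0, 0.0, 0.0], P0=np.eye(3) * 0.1,
                      Q=np.diag([0.05, 0.05, 0.01]),
                      R=np.diag([0.04, 0.04, 0.002]))
    for k in range(10):
        eskf.predict(v=5.0, yaw_rate=0.02, dt=0.1)   # high-rate vehicle-state prediction
        if k % 5 == 4:                               # lower-rate LiDAR pose correction
            eskf.update_lidar_pose([0.5 * (k + 1), 0.1, 0.01])
    print("fused pose:", eskf.x)
```

In this simplified planar case the correction step coincides with a standard EKF update; a full ESKF as described in the abstract would carry richer vehicle states (e.g., velocity and sensor biases) and perform an explicit error-state reset after injection.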