Research on unmanned vehicle obstacle avoidance technology based on LIDAR and depth camera fusion

To address the poor accuracy of the traditional EKF algorithm in estimating the position of unmanned vehicles, and the limited accuracy and map completeness of traditional map building with a single-line LiDAR, this paper proposes a method for creating fused raster maps from multi-source data. First, the combined data of the inertial measurement unit and wheel encoder are corrected with the positional information output by the visual odometer using an error-state SLAM algorithm, and the local raster maps constructed by the LiDAR and the depth camera are fused frame by frame using Bayesian estimation to generate a fused global map. A four-wheeled mobile unmanned vehicle equipped with a LiDAR sensor and a depth camera is then selected as the experimental platform, and obstacle avoidance simulation experiments are conducted in a dynamic environment. The simulation results show that when γ = 5.99, the algorithm generates a new local target point pg2 (17.49, 13.49) and a corresponding escape path, and finally guides the unmanned vehicle to the specified target point, verifying that the proposed method enables the unmanned vehicle to avoid newly appearing obstacles. The study uses LiDAR scan data to estimate the real-time position of the unmanned vehicle, realizing obstacle avoidance and path planning.
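
The map-fusion step described in the abstract, combining per-frame local occupancy (raster) grids from the LiDAR and the depth camera into a global map via Bayesian estimation, can be illustrated with a minimal log-odds sketch. This is an assumption-laden illustration, not the authors' implementation: the grid sizes, cell probabilities, and function names (to_log_odds, fuse_local_grids, etc.) are hypothetical.

# Minimal sketch (assumption, not the paper's code) of Bayesian
# occupancy-grid fusion in log-odds form.
import numpy as np

def to_log_odds(p):
    # Convert occupancy probability to log-odds.
    return np.log(p / (1.0 - p))

def to_prob(l):
    # Convert log-odds back to occupancy probability.
    return 1.0 / (1.0 + np.exp(-l))

def fuse_local_grids(global_log_odds, lidar_probs, depth_probs, prior=0.5):
    # One frame's Bayesian update: add the evidence (log-odds minus the
    # prior's log-odds) contributed by each sensor's local grid.
    evidence = (to_log_odds(lidar_probs) - to_log_odds(prior)) \
             + (to_log_odds(depth_probs) - to_log_odds(prior))
    return global_log_odds + evidence

# Toy example: a 3x3 patch where the LiDAR strongly and the depth camera
# weakly observe the centre cell as occupied.
global_map = np.zeros((3, 3))                   # log-odds 0 == prior 0.5
lidar = np.full((3, 3), 0.5); lidar[1, 1] = 0.9
depth = np.full((3, 3), 0.5); depth[1, 1] = 0.7
global_map = fuse_local_grids(global_map, lidar, depth)
print(to_prob(global_map))                      # centre cell rises above 0.9

Adding log-odds corresponds to multiplying the two sensors' occupancy likelihoods under an independence assumption, which keeps each per-cell update a constant-time operation as new frames arrive.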


Bibliographic Details
Main Authors: Qiu Hao, Chen Weifeng, Ji Aihong, Hu Kai
Author Affiliations: Qiu Hao, Chen Weifeng, Hu Kai: School of Automation, Nanjing University of Information Science & Technology, Nanjing, Jiangsu, 210044, China; Ji Aihong: Lab of Locomotion Bioinspiration and Intelligent Robots, College of Mechanical & Electrical Engineering, Nanjing University of Aeronautics & Astronautics, Nanjing, Jiangsu, 210016, China
Format: Article
Language: English
Published: Sciendo, 2024-01-01
Series: Applied Mathematics and Nonlinear Sciences
ISSN: 2444-8656
Subjects: LiDAR; depth camera; raster map method; unmanned vehicle; obstacle avoidance technique; positional attitude; Bayesian estimation; 78-02
Online Access: https://doi.org/10.2478/amns.2023.2.00575