Research on SLAM Algorithm of Mobile Robot Based on the Fusion of 2D LiDAR and Depth Camera
This paper proposes a new graph-optimization-based Simultaneous Localization and Mapping (SLAM) method that combines a Light Detection and Ranging (LiDAR) sensor, an RGB-D camera, an encoder, and an Inertial Measurement Unit (IMU). It performs joint positioning with the four sensors by ta...
| Main Authors: | Lili Mu, Pantao Yao, Yuchen Zheng, Kai Chen, Fangfang Wang, Nana Qi |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2020-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/9178302/ |
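The abstract above describes fusing relative-motion measurements from several sensors (encoder/IMU odometry, LiDAR and RGB-D scan alignment) in a single graph-based optimization. The sketch below is not the paper's implementation; it only illustrates the general pose-graph idea under simplifying assumptions (2D translation-only poses, made-up measurements and weights, hypothetical function name `solve_pose_graph`).

```python
# Minimal pose-graph fusion sketch (illustrative only, not the authors' method).
# Relative-translation constraints from different sources are stacked into one
# linear least-squares problem and solved jointly.
import numpy as np

def solve_pose_graph(num_poses, edges):
    """Least-squares estimate of 2D positions from relative-motion edges.

    edges: list of (i, j, dx, dy, weight) meaning x_j - x_i ~= (dx, dy).
    Pose 0 is clamped near the origin with a strong prior to fix the gauge.
    """
    dim = 2 * num_poses
    H = np.zeros((dim, dim))   # normal-equation (information) matrix
    b = np.zeros(dim)          # information vector

    # Strong prior anchoring pose 0 at (0, 0).
    H[0:2, 0:2] += 1e6 * np.eye(2)

    for i, j, dx, dy, w in edges:
        z = np.array([dx, dy])
        ii, jj = 2 * i, 2 * j
        # Residual r = (x_j - x_i) - z; Jacobian is -I w.r.t. x_i, +I w.r.t. x_j.
        H[ii:ii + 2, ii:ii + 2] += w * np.eye(2)
        H[jj:jj + 2, jj:jj + 2] += w * np.eye(2)
        H[ii:ii + 2, jj:jj + 2] -= w * np.eye(2)
        H[jj:jj + 2, ii:ii + 2] -= w * np.eye(2)
        b[ii:ii + 2] += -w * z
        b[jj:jj + 2] += w * z

    x = np.linalg.solve(H, b)
    return x.reshape(num_poses, 2)

if __name__ == "__main__":
    # Hypothetical measurements: noisy odometry around part of a square,
    # plus a scan-match loop closure saying pose 3 sits about 1 m above pose 0.
    odometry_edges = [
        (0, 1, 1.05, 0.00, 1.0),
        (1, 2, 0.00, 1.10, 1.0),
        (2, 3, -0.95, 0.00, 1.0),
    ]
    loop_closure = [(0, 3, 0.0, 1.0, 10.0)]  # higher weight for the scan match
    poses = solve_pose_graph(4, odometry_edges + loop_closure)
    for k, (x, y) in enumerate(poses):
        print(f"pose {k}: x = {x:+.3f} m, y = {y:+.3f} m")
```

Because the loop-closure edge carries a larger weight, the small inconsistencies in the odometry chain are redistributed across all poses rather than accumulating at the end, which is the basic benefit of joint graph optimization over chaining measurements sensor by sensor.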
Similar Items
- 3D Radiometric Mapping by Means of LiDAR SLAM and Thermal Camera Data Fusion
  by: Davide De Pazzi, et al.
  Published: (2022-11-01)
- Map Construction Based on LiDAR Vision Inertial Multi-Sensor Fusion
  by: Chuanwei Zhang, et al.
  Published: (2021-12-01)
- A LiDAR SLAM-Assisted Fusion Positioning Method for USVs
  by: Wei Shen, et al.
  Published: (2023-02-01)
- The New Method of Active SLAM for Mapping Using LiDAR
  by: Michal Mihálik, et al.
  Published: (2022-03-01)
- LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios
  by: Kai Dai, et al.
  Published: (2023-02-01)