SLAM Overview: From Single Sensor to Heterogeneous Fusion
After decades of development, LIDAR and visual SLAM have matured and are widely used in military and civil applications. SLAM gives a mobile robot the ability to localize itself and build a map simultaneously, allowing it to operate in indoor and outdoor scenes where GPS signals are scarce. However, SLAM that relies on a single sensor has inherent limitations: LIDAR SLAM struggles in highly dynamic or feature-sparse scenes, while visual SLAM is not robust in low-texture or dark environments. Because the two modalities are complementary, their fusion has great potential, and this paper predicts that SLAM combining LIDAR, cameras, and other sensors will be the mainstream direction in the future. The paper reviews the development history of SLAM, analyzes the hardware characteristics of LIDAR and cameras, and presents classical open-source algorithms and datasets. Organized by the fusion algorithm employed, multi-sensor fusion methods based on uncertainty, on features, and on deep learning are introduced in detail. The strong performance of multi-sensor fusion in complex scenes is summarized, and its future development is discussed.
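The abstract groups classical fusion approaches into uncertainty-based, feature-based, and deep-learning methods. As a minimal, purely illustrative sketch (not taken from the paper), the snippet below shows the core idea behind uncertainty-based fusion: two independent pose estimates, here a hypothetical LIDAR and a hypothetical visual measurement, are combined by inverse-covariance weighting, which is the static form of a Kalman-filter update. All values and variable names are assumptions for illustration.

```python
# Minimal illustrative sketch (not code from the paper): two estimates of
# the same robot position, one from LIDAR odometry and one from visual
# odometry, are fused by weighting each with the inverse of its covariance.
# This is the static special case of a Kalman/EKF measurement update.
# All numbers below are hypothetical.
import numpy as np

def fuse(mean_a, cov_a, mean_b, cov_b):
    """Fuse two Gaussian estimates of the same state (information-weighted)."""
    info_a = np.linalg.inv(cov_a)            # information matrix = inverse covariance
    info_b = np.linalg.inv(cov_b)
    cov_f = np.linalg.inv(info_a + info_b)   # fused covariance is tighter than either input
    mean_f = cov_f @ (info_a @ mean_a + info_b @ mean_b)
    return mean_f, cov_f

# LIDAR pose estimate: precise ranging, small covariance.
lidar_xy, lidar_cov = np.array([2.00, 1.10]), np.diag([0.02, 0.02])
# Visual pose estimate: noisier in a low-texture scene, larger covariance.
vis_xy, vis_cov = np.array([2.15, 1.00]), np.diag([0.20, 0.20])

xy, cov = fuse(lidar_xy, lidar_cov, vis_xy, vis_cov)
print("fused position:", xy)        # pulled toward the more certain LIDAR estimate
print("fused covariance:\n", cov)
```

In a real LIDAR-visual system the same weighting typically sits inside an EKF or factor-graph back end, with time-varying covariances supplied by each sensor front end.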
Main Authors: | Weifeng Chen, Chengjun Zhou, Guangtao Shang, Xiyang Wang, Zhenxiong Li, Chonghui Xu, Kai Hu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-11-01 |
Series: | Remote Sensing |
Subjects: | SLAM; LIDAR SLAM; visual SLAM; multi-sensor fusion; mobile robot |
Online Access: | https://www.mdpi.com/2072-4292/14/23/6033 |
author | Weifeng Chen; Chengjun Zhou; Guangtao Shang; Xiyang Wang; Zhenxiong Li; Chonghui Xu; Kai Hu |
collection | DOAJ |
format | Article |
id | doaj.art-8d13b6582ca64d79b447628e3352bdfa |
institution | Directory Open Access Journal |
issn | 2072-4292 |
language | English |
publishDate | 2022-11-01 |
publisher | MDPI AG |
series | Remote Sensing |
citation | Remote Sensing, vol. 14, no. 23, art. 6033, MDPI AG, 2022-11-01; DOI: 10.3390/rs14236033 |
affiliations | Weifeng Chen: College of Mechanical and Electronic Engineering, Quanzhou University of Information Engineering, Quanzhou 362000, China; Chengjun Zhou, Guangtao Shang, Xiyang Wang, Zhenxiong Li, Chonghui Xu, and Kai Hu: School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, China |
title | SLAM Overview: From Single Sensor to Heterogeneous Fusion |
topic | SLAM; LIDAR SLAM; visual SLAM; multi-sensor fusion; mobile robot |
url | https://www.mdpi.com/2072-4292/14/23/6033 |