Research Progress on 3D Object Detection of LiDAR Point Cloud


Bibliographic Details
Main Authors: ZHOU Yan, PU Lei, LIN Liangxi, LIU Xiangyu, ZENG Fanzhi, ZHOU Yuexia
Format: Article
Language: Chinese (zho)
Published: Journal of Computer Engineering and Applications Beijing Co., Ltd., Science Press, 2022-12-01
Series: Jisuanji kexue yu tansuo (Journal of Frontiers of Computer Science and Technology)
Online Access: http://fcst.ceaj.org/fileup/1673-9418/PDF/2206026.pdf
Description
Summary: 3D object detection has emerged as a research direction in recent years; its main task is the localization and recognition of targets in 3D space. Existing 3D object detection methods based on monocular or binocular stereo vision are easily affected by object occlusion, viewpoint changes, and scale changes in 3D scenes, leading to poor detection accuracy and robustness. LiDAR point clouds provide 3D scene information directly, so deep learning methods for 3D object detection based on LiDAR point clouds have become a research hotspot in the field of 3D vision. This paper reviews recent research on 3D object detection based on LiDAR point clouds. Firstly, according to the data form of the network input, the methods are divided into four categories: point-based, point cloud projection based, point cloud voxelization based, and multi-modal fusion based 3D object detection, and the most representative methods in each category are described in detail. Then common datasets are introduced, the performance of representative methods is evaluated, and the advantages and limitations of each method are discussed from several aspects. Finally, remaining shortcomings and difficulties are identified, and future development directions are discussed.
ISSN: 1673-9418
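
Illustrative note: to make the survey's input-representation taxonomy concrete, the following is a minimal NumPy sketch of two of the four input forms it categorizes, point cloud voxelization and bird's-eye-view (BEV) projection. This is not code from the surveyed paper; the function names, grid resolutions, and range bounds are illustrative assumptions.

```python
import numpy as np

def voxelize(points, voxel_size=(0.2, 0.2, 0.2),
             bounds=((0.0, 70.4), (-40.0, 40.0), (-3.0, 1.0))):
    """Assign each LiDAR point to an integer voxel index (hypothetical helper).

    points: (N, 3+) array whose first three columns are x, y, z.
    Returns per-point voxel coordinates and the set of occupied voxels.
    """
    mins = np.array([b[0] for b in bounds])
    maxs = np.array([b[1] for b in bounds])
    xyz = points[:, :3]
    # Keep only points inside the detection range.
    mask = np.all((xyz >= mins) & (xyz < maxs), axis=1)
    xyz = xyz[mask]
    # Integer voxel coordinates along x, y, z.
    coords = ((xyz - mins) / np.array(voxel_size)).astype(np.int32)
    # Unique occupied voxels; real detectors also aggregate per-voxel features.
    occupied = np.unique(coords, axis=0)
    return coords, occupied

def bev_projection(points, resolution=0.2,
                   bounds=((0.0, 70.4), (-40.0, 40.0))):
    """Project points onto a bird's-eye-view occupancy image (x forward, y left)."""
    (x0, x1), (y0, y1) = bounds
    h = int((x1 - x0) / resolution)
    w = int((y1 - y0) / resolution)
    img = np.zeros((h, w), dtype=np.float32)
    mask = (points[:, 0] >= x0) & (points[:, 0] < x1) & \
           (points[:, 1] >= y0) & (points[:, 1] < y1)
    pts = points[mask]
    rows = ((pts[:, 0] - x0) / resolution).astype(np.int32)
    cols = ((pts[:, 1] - y0) / resolution).astype(np.int32)
    img[rows, cols] = 1.0  # plain occupancy; height/intensity channels are common too
    return img

if __name__ == "__main__":
    # Fake scan: 1000 points with x in [0, 70), y in [-40, 40), z in [-3, 1).
    pts = np.random.rand(1000, 4) * [70, 80, 4, 1] - [0, 40, 3, 0]
    coords, occupied = voxelize(pts)
    print("occupied voxels:", occupied.shape[0])
    print("BEV image shape:", bev_projection(pts).shape)
```

In practice, the voxel-based and projection-based detectors the survey covers aggregate richer per-voxel or per-pixel features (maximum height, intensity, learned embeddings) rather than plain occupancy, but the discretization step above is the shared starting point.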