3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems

Bibliographic Details
Main Authors: Chunmian Lin, Daxin Tian, Xuting Duan, Jianshan Zhou
Format: Article
Language: English
Published: Tsinghua University Press 2021-03-01
Series: Complex System Modeling and Simulation
Online Access: https://www.sciopen.com/article/10.23919/CSMS.2021.0004
Description
Summary: Self-driving vehicles require numerous tests to prevent fatal accidents and to ensure appropriate operation in the physical world. However, conducting vehicle tests on the road is difficult because such tests are expensive and labor intensive. In this study, we used an autonomous-driving simulator and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated CarlaSim, a dataset built from virtual traffic scenarios that comprises 15 000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. We then developed a Multi-Sensor Fusion Perception (MSFP) model that consumes the two-modal data and detects objects in the scenes. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrate the effectiveness of the proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will facilitate the future development of autonomous-driving simulation tests.
ISSN:2096-9929
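
The summary describes generating the CarlaSim dataset from CARLA traffic scenarios as paired camera and LiDAR samples. The sketch below is a minimal, hypothetical illustration of how such two-modal data collection can be scripted with the standard CARLA Python API; the sensor placements, resolutions, and sample count are assumptions for illustration and are not taken from the article.

```python
# Hypothetical sketch: collecting synchronized camera-LiDAR samples with the
# CARLA Python API, in the spirit of the CarlaSim dataset described above.
# Sensor names come from CARLA's standard blueprint library; attribute values
# (resolution, channels, range, placement) are illustrative assumptions.
import queue

import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Run the simulator in synchronous mode so camera and LiDAR frames line up.
settings = world.get_settings()
settings.synchronous_mode = True
settings.fixed_delta_seconds = 0.05  # 20 Hz simulation step
world.apply_settings(settings)

blueprints = world.get_blueprint_library()

# Spawn an ego vehicle and let the built-in autopilot drive it.
vehicle_bp = blueprints.filter("vehicle.tesla.model3")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)
vehicle.set_autopilot(True)

# RGB camera mounted on the roof (placement values are assumptions).
camera_bp = blueprints.find("sensor.camera.rgb")
camera_bp.set_attribute("image_size_x", "1242")
camera_bp.set_attribute("image_size_y", "375")
camera = world.spawn_actor(
    camera_bp, carla.Transform(carla.Location(x=1.5, z=2.4)), attach_to=vehicle
)

# Rotating LiDAR co-located with the camera.
lidar_bp = blueprints.find("sensor.lidar.ray_cast")
lidar_bp.set_attribute("channels", "64")
lidar_bp.set_attribute("range", "100")
lidar = world.spawn_actor(
    lidar_bp, carla.Transform(carla.Location(x=1.5, z=2.4)), attach_to=vehicle
)

# Queues let us pull one camera image and one point cloud per simulation tick.
image_queue, lidar_queue = queue.Queue(), queue.Queue()
camera.listen(image_queue.put)
lidar.listen(lidar_queue.put)

for frame in range(100):  # number of samples is arbitrary here
    world.tick()
    image = image_queue.get()
    points = lidar_queue.get()
    image.save_to_disk(f"carlasim/image_{frame:06d}.png")
    points.save_to_disk(f"carlasim/lidar_{frame:06d}.ply")

camera.destroy()
lidar.destroy()
vehicle.destroy()
```

In practice, such a script would also export the camera intrinsics and the camera-LiDAR extrinsic transform as calibration files, and object annotations from the simulator's ground truth, which the summary indicates accompany each CarlaSim sample.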