3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems

Self-driving vehicles require extensive testing to prevent fatal accidents and to ensure that they operate properly in the physical world. However, on-road vehicle tests are difficult to conduct because they are expensive and labor intensive. In this study, we used an autonomous-driving simulator and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated CarlaSim, a dataset of virtual traffic scenarios comprising 15 000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. We then developed a Multi-Sensor Fusion Perception (MSFP) model that consumes the two-modal data and detects objects in the scene. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrate the effectiveness of the proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will facilitate the future development of simulated autonomous-driving tests.
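
The abstract describes recording synchronized camera and LiDAR samples from CARLA. As an illustration only, the minimal sketch below shows how such two-modal data can be captured with the public CARLA Python API; the sensor placement, attribute values, and output paths are assumptions, and this is not the authors' actual CarlaSim generation pipeline.

```python
import carla

# Connect to a locally running CARLA server (default port 2000).
client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()

# Synchronous mode keeps camera and LiDAR frames aligned to the same simulation tick.
settings = world.get_settings()
settings.synchronous_mode = True
settings.fixed_delta_seconds = 0.05
world.apply_settings(settings)
traffic_manager = client.get_trafficmanager()
traffic_manager.set_synchronous_mode(True)

bp_lib = world.get_blueprint_library()

# Ego vehicle driven by the autopilot.
vehicle_bp = bp_lib.filter('vehicle.*')[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)
vehicle.set_autopilot(True)

# Roof-mounted RGB camera (resolution is an illustrative choice).
cam_bp = bp_lib.find('sensor.camera.rgb')
cam_bp.set_attribute('image_size_x', '1242')
cam_bp.set_attribute('image_size_y', '375')
camera = world.spawn_actor(cam_bp, carla.Transform(carla.Location(x=1.5, z=2.4)),
                           attach_to=vehicle)

# Spinning LiDAR mounted next to the camera.
lidar_bp = bp_lib.find('sensor.lidar.ray_cast')
lidar_bp.set_attribute('channels', '64')
lidar_bp.set_attribute('range', '100')
lidar_bp.set_attribute('rotation_frequency', '20')
lidar = world.spawn_actor(lidar_bp, carla.Transform(carla.Location(x=0.0, z=2.4)),
                          attach_to=vehicle)

# Dump the raw frames; object annotations and calibration would be exported separately.
camera.listen(lambda image: image.save_to_disk('out/image_%06d.png' % image.frame))
lidar.listen(lambda scan: scan.save_to_disk('out/lidar_%06d.ply' % scan.frame))

for _ in range(100):   # record 100 synchronized camera-LiDAR frames
    world.tick()

camera.stop()
lidar.stop()
```

In a full pipeline, per-frame object labels and the camera-LiDAR calibration matrices would be written out alongside the images and point clouds, which is the role the CarlaSim annotation and calibration files play.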


Bibliographic Details
Main Authors: Chunmian Lin, Daxin Tian, Xuting Duan, Jianshan Zhou
Format: Article
Language: English
Published: Tsinghua University Press, 2021-03-01
Series: Complex System Modeling and Simulation
Subjects: autonomous-driving system; environmental perception; simulated test; deep-learning model
Online Access: https://www.sciopen.com/article/10.23919/CSMS.2021.0004
author Chunmian Lin
Daxin Tian
Xuting Duan
Jianshan Zhou
collection DOAJ
description Self-driving vehicles require extensive testing to prevent fatal accidents and to ensure that they operate properly in the physical world. However, on-road vehicle tests are difficult to conduct because they are expensive and labor intensive. In this study, we used an autonomous-driving simulator and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated CarlaSim, a dataset of virtual traffic scenarios comprising 15 000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. We then developed a Multi-Sensor Fusion Perception (MSFP) model that consumes the two-modal data and detects objects in the scene. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrate the effectiveness of the proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will facilitate the future development of simulated autonomous-driving tests.
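
Because the description mentions calibration files and the fusion of two-modal camera-LiDAR data, the sketch below illustrates the standard geometric alignment such fusion typically relies on: projecting LiDAR points into the camera image. The KITTI-style matrix names (P2, R0_rect, Tr_velo_to_cam) are assumptions, and this is not the authors' MSFP model.

```python
import numpy as np

def project_lidar_to_image(points, Tr_velo_to_cam, R0_rect, P2):
    """Project LiDAR points into pixel coordinates of a camera.

    points         : (N, 3) xyz coordinates in the LiDAR frame
    Tr_velo_to_cam : (3, 4) rigid transform from the LiDAR frame to the camera frame
    R0_rect        : (3, 3) rectifying rotation of the camera
    P2             : (3, 4) camera projection matrix
    Returns the (M, 2) pixel coordinates of the points in front of the camera
    and the boolean mask selecting those points.
    """
    n = points.shape[0]
    pts_h = np.hstack([points, np.ones((n, 1))])   # (N, 4) homogeneous LiDAR points
    cam = R0_rect @ (Tr_velo_to_cam @ pts_h.T)      # (3, N) rectified camera coordinates
    cam_h = np.vstack([cam, np.ones((1, n))])       # (4, N) homogeneous camera coordinates
    img = P2 @ cam_h                                # (3, N) image-plane coordinates
    uv = img[:2] / img[2]                           # perspective divide -> pixels
    in_front = cam[2] > 0                           # keep only points ahead of the camera
    return uv.T[in_front], in_front
```

Once LiDAR points are mapped to pixel coordinates, image features can be associated with individual points before the combined representation is passed to a fusion detector.
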
format Article
id doaj.art-b479cf3e6f594c0daf6466a408d63606
institution Directory Open Access Journal
issn 2096-9929
language English
publishDate 2021-03-01
publisher Tsinghua University Press
record_format Article
series Complex System Modeling and Simulation
spelling Chunmian Lin, Daxin Tian, Xuting Duan, and Jianshan Zhou (School of Transportation Science and Engineering, Beihang University, Beijing 100191, China), "3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems," Complex System Modeling and Simulation, vol. 1, no. 1, pp. 45-54, 2021-03-01, Tsinghua University Press, ISSN 2096-9929, doi: 10.23919/CSMS.2021.0004
title 3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems
topic autonomous-driving system
environmental perception
simulated test
deep-learning model
url https://www.sciopen.com/article/10.23919/CSMS.2021.0004