An Enhanced Detector for Vulnerable Road Users Using Infrastructure-Sensors-Enabled Device

Bibliographic Details
Main Authors: Jian Shi, Dongxian Sun, Minh Kieu, Baicang Guo, Ming Gao
Format: Article
Language: English
Published: MDPI AG 2023-12-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/24/1/59
Description
Summary: The precise and real-time detection of vulnerable road users (VRUs) using infrastructure-sensors-enabled devices is crucial for the advancement of intelligent traffic monitoring systems. To overcome prevalent inefficiencies in VRU detection, this paper introduces an enhanced detector that utilizes a lightweight backbone network integrated with a parameterless attention mechanism. This integration significantly enhances the feature-extraction capability for small targets within high-resolution images. Additionally, the design features a streamlined ‘neck’ and a dynamic detection head, both augmented with a pruning algorithm that reduces the model’s parameter count and ensures a compact architecture. Using the specialized engineering dataset De_VRU, the model was deployed on the Hisilicon_Hi3516DV300 platform, which is designed for infrastructure units. Rigorous ablation studies, with YOLOv7-tiny as the baseline, confirm the detector’s efficacy on the BDD100K and LLVIP datasets: the model improved mAP@50 by more than 12% while reducing the parameter count by more than 40% and inference time by 50%. Visualization results and a case study illustrate the detector’s proficiency at real-time detection on high-resolution imagery, underscoring its practical applicability.
ISSN:1424-8220
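
Note: The summary does not name the parameterless attention mechanism, and the paper's own code is not reproduced here. SimAM is a widely used parameter-free attention module that fits the description, so the PyTorch sketch below illustrates that idea under that assumption; the class name ParameterFreeAttention and the toy tensor shapes are hypothetical, chosen only for illustration.

    import torch
    import torch.nn as nn

    class ParameterFreeAttention(nn.Module):
        """SimAM-style attention: derives a per-activation weight from an
        energy function over each feature map, adding no learnable parameters."""

        def __init__(self, e_lambda: float = 1e-4):
            super().__init__()
            self.e_lambda = e_lambda  # small stabilizer for the variance term

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, height, width)
            n = x.shape[2] * x.shape[3] - 1
            # Squared deviation of every activation from its channel mean
            d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
            # Per-channel spatial variance (denominator of the energy function)
            v = d.sum(dim=(2, 3), keepdim=True) / n
            # Inverse energy: distinctive activations receive larger weights
            e_inv = d / (4 * (v + self.e_lambda)) + 0.5
            # Gate the input with a sigmoid-scaled attention map
            return x * torch.sigmoid(e_inv)

    if __name__ == "__main__":
        # Toy check on a feature map shaped like a detector backbone stage
        feats = torch.randn(1, 64, 80, 80)
        out = ParameterFreeAttention()(feats)
        print(out.shape)  # torch.Size([1, 64, 80, 80])
        print(sum(p.numel() for p in ParameterFreeAttention().parameters()))  # 0

Because the attention map is computed directly from feature statistics, the module contributes zero learnable parameters, which is consistent with the compact-architecture goal the summary describes for deployment on Hi3516DV300-class infrastructure hardware.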