Decentralized Multi-Agent DQN-Based Resource Allocation for Heterogeneous Traffic in V2X Communications

Vehicle-to-everything (V2X) communication is a pivotal technology for advanced driving, encompassing autonomous driving and Intelligent Transportation Systems (ITS). Beyond direct vehicle-to-vehicle (V2V) communication, vehicle-to-infrastructure (V2I) communication via a Road Side Unit (RSU) can play an important role in efficient traffic management and in enhancing advanced driving by providing surrounding vehicles with proper road information. To accommodate diverse V2X scenarios, heterogeneous traffic with varied objectives, formats, and sizes must be supported. We tackle the challenge of resource allocation for heterogeneous traffic in RSU-deployed V2X communications, proposing a decentralized Multi-Agent Reinforcement Learning (MARL)-based resource allocation scheme with limited shared resources. To reduce model complexity, the RSU is modeled as a collection of virtual agents, each with a small action space, instead of as a single agent selecting multiple resources at the same time. A weighted global reward is introduced to incorporate traffic heterogeneity efficiently. The performance is evaluated and compared with random, 5G NR mode 2, and optimal allocation schemes in terms of Packet Reception Ratio (PRR) and communication range. The proposed scheme nearly matches the performance of the optimal scheme and significantly outperforms the random allocation scheme in both underload and overload situations.
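The abstract describes the scheme only at a high level. As a rough illustration of the two ideas it names, modeling the RSU as several virtual agents with a small per-agent action space and training them with a weighted global reward, the following minimal PyTorch sketch may help. It is not the authors' code: the network sizes, traffic classes, priority weights, and the simplified one-step update are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): each virtual agent owns one
# resource "slot" at the RSU and learns, via its own small DQN, which sub-channel
# to use. A weighted global reward favours high-priority (e.g. safety) traffic.
# NUM_AGENTS, NUM_CHANNELS, STATE_DIM and TRAFFIC_WEIGHTS are assumed values.
import random
import torch
import torch.nn as nn

NUM_AGENTS = 4          # virtual agents replacing one monolithic RSU agent
NUM_CHANNELS = 6        # shared sub-channels each agent can pick from
STATE_DIM = 10          # local observation size (channel gains, queue info, ...)
TRAFFIC_WEIGHTS = {"safety": 1.0, "infotainment": 0.3}   # assumed priorities

class SmallDQN(nn.Module):
    """Per-agent Q-network: the action space is just the sub-channel index."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, NUM_CHANNELS),
        )

    def forward(self, x):
        return self.net(x)

def weighted_global_reward(delivered):
    """delivered: list of (traffic_type, success_flag) pairs for the current step."""
    if not delivered:
        return 0.0
    return sum(TRAFFIC_WEIGHTS[t] * float(ok) for t, ok in delivered) / len(delivered)

agents = [SmallDQN() for _ in range(NUM_AGENTS)]
optims = [torch.optim.Adam(a.parameters(), lr=1e-3) for a in agents]
EPSILON = 0.1           # exploration rate

def step(local_obs, delivered):
    """One decentralized decision and learning step (single-step update for brevity)."""
    actions, chosen_q = [], []
    for agent, obs in zip(agents, local_obs):
        q = agent(obs)
        a = random.randrange(NUM_CHANNELS) if random.random() < EPSILON else int(q.argmax())
        actions.append(a)
        chosen_q.append(q[a])
    r = weighted_global_reward(delivered)        # one shared scalar reward for all agents
    for opt, v in zip(optims, chosen_q):
        loss = (v - r) ** 2                      # bandit-style target; a full DQN would
        opt.zero_grad()                          # bootstrap with replay and a target net
        loss.backward()
        opt.step()
    return actions

# Example call with dummy observations and delivery outcomes:
obs = [torch.randn(STATE_DIM) for _ in range(NUM_AGENTS)]
outcomes = [("safety", True), ("infotainment", False)]
print(step(obs, outcomes))
```

Splitting the RSU into per-resource virtual agents keeps each agent's choice down to NUM_CHANNELS actions, instead of one agent facing the combinatorial selection of several resources at once, which is the model-complexity reduction the abstract refers to.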

Bibliographic Details
Main Authors: Insung Lee (https://orcid.org/0000-0002-2548-2099), Duk Kyung Kim (https://orcid.org/0000-0002-0161-013X)
Affiliation: Department of Information and Communication Engineering, Inha University, Incheon, South Korea
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access, vol. 12, pp. 3070-3084
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3349350
Subjects: Vehicle-to-everything (V2X); resource allocation; heterogeneous traffic; decentralized multi-agent DQN
Online Access: https://ieeexplore.ieee.org/document/10379611/
Collection: Directory of Open Access Journals (DOAJ), record doaj.art-0651eea25dbf46dab559858e9607f255