LTransformer: A Transformer-Based Framework for Task Offloading in Vehicular Edge Computing

Bibliographic Details
Main Authors: Yichi Yang, Ruibin Yan, Yijun Gu
Format: Article
Language: English
Published: MDPI AG 2023-09-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/13/18/10232
Description
Summary: Vehicular edge computing (VEC) is essential to vehicle applications such as traffic control and in-vehicle services. In the task offloading process of VEC, predictive-mode transmission based on deep learning is constrained by limited computational resources. Furthermore, the accuracy of deep learning algorithms in VEC suffers because the algorithms do not account for edge computing characteristics. To address these problems, this paper proposes a task offloading optimization approach that enables edge servers to store deep learning models, and introduces the LTransformer, a transformer-based framework that incorporates edge computing features. The framework consists of pre-training, an input module, an encoding–decoding module, and an output module. Compared with four sequential deep learning methods, namely a Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), a Gated Recurrent Unit (GRU), and the Transformer, the LTransformer achieves the highest accuracy on the real-world dataset, reaching 80.1%. In addition, the LTransformer takes only 0.008 s to predict a single trajectory, fully satisfying the requirements of real-time prediction and enabling task offloading optimization.
ISSN: 2076-3417
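
Illustrative sketch: the summary describes a framework built from an input module, an encoding–decoding module, and an output module for trajectory prediction. The following minimal PyTorch sketch shows what such a transformer-based trajectory predictor could look like in outline; it is not the authors' implementation, and the class name, layer sizes, feature dimensions, and the simple last-point decoder query are assumptions made purely for illustration.

# Hypothetical sketch only; not the LTransformer from the paper.
import torch
import torch.nn as nn

class TrajectoryTransformer(nn.Module):
    def __init__(self, feat_dim=2, d_model=64, nhead=4, num_layers=2, horizon=1):
        super().__init__()
        # Input module: project raw trajectory features (e.g. x, y coordinates)
        # into the model dimension.
        self.input_proj = nn.Linear(feat_dim, d_model)
        # Encoding-decoding module: a standard transformer encoder-decoder.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        # Output module: map decoder states back to trajectory coordinates.
        self.output_proj = nn.Linear(d_model, feat_dim)
        self.horizon = horizon

    def forward(self, history):
        # history: (batch, seq_len, feat_dim) past trajectory points.
        src = self.input_proj(history)
        # Use the last observed point, repeated, as a simple decoder query
        # (an assumption for this sketch, not the paper's scheme).
        tgt = src[:, -1:, :].repeat(1, self.horizon, 1)
        out = self.transformer(src, tgt)
        return self.output_proj(out)   # (batch, horizon, feat_dim)

# Usage with dummy data:
model = TrajectoryTransformer()
past = torch.randn(8, 10, 2)   # 8 vehicles, 10 past (x, y) points each
pred = model(past)             # predicted next point(s), shape (8, 1, 2)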