LTransformer: A Transformer-Based Framework for Task Offloading in Vehicular Edge Computing
Vehicular edge computing (VEC) is essential in vehicle applications such as traffic control and in-vehicle services. In the task offloading process of VEC, predictive-mode transmission based on deep learning is constrained by limited computational resources. Furthermore, the accuracy of deep learning algorithms in VEC is compromised because these algorithms do not incorporate edge computing features. To solve these problems, this paper proposes a task offloading optimization approach that enables edge servers to store deep learning models. Moreover, this paper proposes the LTransformer, a transformer-based framework that incorporates edge computing features. The framework consists of pre-training, an input module, an encoding–decoding module, and an output module. Compared with four sequential deep learning methods, namely a Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), a Gated Recurrent Unit (GRU), and the Transformer, the LTransformer achieves the highest accuracy, reaching 80.1% on the real dataset. In addition, the LTransformer takes only 0.008 s to predict a single trajectory, fully satisfying the fundamental requirements of real-time prediction and enabling task offloading optimization.
Main Authors: | Yichi Yang, Ruibin Yan, Yijun Gu |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-09-01 |
Series: | Applied Sciences |
Subjects: | edge computing; task offloading; trajectory prediction; deep learning |
Online Access: | https://www.mdpi.com/2076-3417/13/18/10232 |
_version_ | 1797581441019674624 |
author | Yichi Yang; Ruibin Yan; Yijun Gu
author_facet | Yichi Yang; Ruibin Yan; Yijun Gu
author_sort | Yichi Yang |
collection | DOAJ |
description | Vehicular edge computing (VEC) is essential in vehicle applications such as traffic control and in-vehicle services. In the task offloading process of VEC, predictive-mode transmission based on deep learning is constrained by limited computational resources. Furthermore, the accuracy of deep learning algorithms in VEC is compromised because these algorithms do not incorporate edge computing features. To solve these problems, this paper proposes a task offloading optimization approach that enables edge servers to store deep learning models. Moreover, this paper proposes the LTransformer, a transformer-based framework that incorporates edge computing features. The framework consists of pre-training, an input module, an encoding–decoding module, and an output module. Compared with four sequential deep learning methods, namely a Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), a Gated Recurrent Unit (GRU), and the Transformer, the LTransformer achieves the highest accuracy, reaching 80.1% on the real dataset. In addition, the LTransformer takes only 0.008 s to predict a single trajectory, fully satisfying the fundamental requirements of real-time prediction and enabling task offloading optimization.
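The abstract describes the LTransformer as a transformer-based trajectory predictor built from a pre-training stage, an input module, an encoding–decoding module, and an output module. For orientation only, below is a minimal, hypothetical PyTorch sketch of a generic transformer encoder–decoder for next-location prediction; the class name TrajectoryTransformer, the grid-cell tokenization of trajectories, and all hyperparameters are assumptions rather than details from the paper, and the pre-training and edge-computing features are omitted.

```python
# Hypothetical sketch (not the authors' LTransformer): a transformer
# encoder-decoder that predicts the next location tokens of a vehicle
# trajectory, assuming locations are discretized into grid-cell IDs.
import torch
import torch.nn as nn


class TrajectoryTransformer(nn.Module):
    def __init__(self, num_locations: int, d_model: int = 128, nhead: int = 4,
                 num_layers: int = 2, max_len: int = 64):
        super().__init__()
        self.embed = nn.Embedding(num_locations, d_model)   # input module: grid-cell IDs -> vectors
        self.pos = nn.Embedding(max_len, d_model)            # learned positional encoding
        self.transformer = nn.Transformer(                   # encoding-decoding module
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, num_locations)         # output module: logits over grid cells

    def _embed_with_pos(self, tokens: torch.Tensor) -> torch.Tensor:
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return self.embed(tokens) + self.pos(positions)

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # src: observed trajectory tokens (batch, src_len); tgt: future tokens (batch, tgt_len)
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt.size(1)).to(tgt.device)
        hidden = self.transformer(self._embed_with_pos(src), self._embed_with_pos(tgt),
                                  tgt_mask=tgt_mask)
        return self.out(hidden)                               # (batch, tgt_len, num_locations)


if __name__ == "__main__":
    model = TrajectoryTransformer(num_locations=1000)
    src = torch.randint(0, 1000, (8, 20))   # 8 observed trajectories, 20 time steps each
    tgt = torch.randint(0, 1000, (8, 5))    # 5-step prediction window (teacher forcing)
    print(model(src, tgt).shape)            # expected: torch.Size([8, 5, 1000])
```

A compact model of this general shape could plausibly be stored on an edge server and queried per vehicle within the 0.008 s per-trajectory budget the abstract reports, though the paper's actual architecture and runtime setup may differ.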
first_indexed | 2024-03-10T23:04:54Z |
format | Article |
id | doaj.art-7047a6ca68e84d7294a5d9dfb9909e04 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
last_indexed | 2024-03-10T23:04:54Z |
publishDate | 2023-09-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj.art-7047a6ca68e84d7294a5d9dfb9909e04 (2023-11-19T09:24:50Z); English; MDPI AG, Applied Sciences, ISSN 2076-3417, published 2023-09-01, vol. 13, no. 18, art. 10232, DOI 10.3390/app131810232; LTransformer: A Transformer-Based Framework for Task Offloading in Vehicular Edge Computing; Yichi Yang, Ruibin Yan, Yijun Gu (College of Information and Cyber Security, People’s Public Security University of China, Beijing 102600, China); https://www.mdpi.com/2076-3417/13/18/10232; keywords: edge computing, task offloading, trajectory prediction, deep learning
title | LTransformer: A Transformer-Based Framework for Task Offloading in Vehicular Edge Computing |
title_sort | ltransformer a transformer based framework for task offloading in vehicular edge computing |
topic | edge computing; task offloading; trajectory prediction; deep learning
url | https://www.mdpi.com/2076-3417/13/18/10232 |