Reinforcement learning based task scheduling for environmentally sustainable federated cloud computing

Abstract: The substantial energy consumption of data centers is a major contributor to global energy use and carbon emissions, so reducing energy consumption and carbon emissions in data centers plays a crucial role in sustainable development. Traditional cloud computing has reached a bottleneck, primarily due to high energy consumption. The emerging federated cloud approach can reduce the energy consumption and carbon emissions of cloud data centers by exploiting the geographical differences among the member data centers of a federated cloud. In this paper, we propose Eco-friendly Reinforcement Learning in Federated Cloud (ERLFC), a framework that uses reinforcement learning for task scheduling in a federated cloud environment. ERLFC intelligently considers the state of each data center and exploits the variations in energy and carbon emission ratios across the geographically distributed cloud data centers in the federated cloud. We build ERLFC on the Actor-Critic algorithm, which selects the data center to which a task is assigned based on factors such as the energy consumption, cooling method, task waiting time, energy type, emission ratio, and total energy consumption of the current cloud data center, together with the details of the next task. To demonstrate the effectiveness of ERLFC, we conducted simulations based on real-world task execution data; the results show that ERLFC effectively reduces energy consumption and emissions during task execution. Compared with the Round Robin, Random, SO, and GJO algorithms, ERLFC improves energy saving and emission reduction by factors of 1.09, 1.08, 1.21, and 1.26, respectively.

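The abstract describes ERLFC as an Actor-Critic scheduler: the agent observes per-data-center features (energy consumption, cooling method, task waiting time, energy type, emission ratio) plus the incoming task, and picks the data center that minimizes energy use and emissions. The paper's implementation is not reproduced in this record; the sketch below is a minimal, self-contained one-step actor-critic scheduler in that spirit, where the feature set, reward weights, and the toy data-center simulator are illustrative assumptions rather than details from the paper.

```python
# Minimal one-step actor-critic scheduler sketch (illustrative only; the number
# of data centers, features, reward weights, and simulator are assumed).
import numpy as np

rng = np.random.default_rng(0)

N_DC = 4                                         # assumed number of federated data centers
energy_per_unit = rng.uniform(0.8, 1.2, N_DC)    # energy used per unit of work (assumed)
emission_ratio = rng.uniform(0.2, 0.9, N_DC)     # carbon emitted per unit of energy (assumed)
queue_time = np.zeros(N_DC)                      # current waiting time at each data center

def state(task_size):
    # State = per-data-center features concatenated with the incoming task size.
    return np.concatenate([energy_per_unit, emission_ratio, queue_time, [task_size]])

DIM = 3 * N_DC + 1
actor_w = np.zeros((N_DC, DIM))                  # linear softmax policy over data centers
critic_w = np.zeros(DIM)                         # linear state-value function
alpha_actor, alpha_critic, gamma = 1e-3, 1e-2, 0.99

def policy(s):
    logits = actor_w @ s
    p = np.exp(logits - logits.max())            # numerically stable softmax
    return p / p.sum()

for step in range(5000):
    task = rng.uniform(1.0, 5.0)                 # size of the arriving task (assumed units)
    s = state(task)
    probs = policy(s)
    a = rng.choice(N_DC, p=probs)                # schedule the task on data center `a`

    # Reward penalises energy use, emissions, and queueing delay (weights assumed).
    energy = task * energy_per_unit[a]
    reward = -(energy + energy * emission_ratio[a] + 0.1 * queue_time[a])
    queue_time *= 0.9                            # queues drain between arrivals
    queue_time[a] += task                        # the chosen data center gets busier

    s_next = state(rng.uniform(1.0, 5.0))

    # One-step TD error drives both the critic and the actor updates.
    td_error = reward + gamma * critic_w @ s_next - critic_w @ s
    critic_w += alpha_critic * td_error * s
    grad_log_pi = -np.outer(probs, s)            # d/dw log softmax: (1[k=a] - pi_k) * s
    grad_log_pi[a] += s
    actor_w += alpha_actor * td_error * grad_log_pi
```

A linear softmax policy with a TD(0) critic keeps the sketch dependency-light; the paper's actual state encoding, networks, and reward formulation are likely richer than this toy version.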

Bibliographic Details
Main Authors: Zhibao Wang, Shuaijun Chen, Lu Bai, Juntao Gao, Jinhua Tao, Raymond R. Bond, Maurice D. Mulvenna
Format: Article
Language: English
Published: SpringerOpen, 2023-12-01
Series: Journal of Cloud Computing: Advances, Systems and Applications
Subjects: Cloud computing; Federated cloud; Reinforcement learning; Energy efficiency; Carbon emissions; Task scheduling
Online Access: https://doi.org/10.1186/s13677-023-00553-0
Collection: Directory of Open Access Journals (DOAJ)
Record ID: doaj.art-02d992fc9ade4897b2f7c8024a3d0550
ISSN: 2192-113X
Author affiliations:
Zhibao Wang: School of Computer and Information Technology, Northeast Petroleum University
Shuaijun Chen: School of Computer and Information Technology, Northeast Petroleum University
Lu Bai: School of Electronics, Electrical Engineering and Computer Science, Queen’s University Belfast
Juntao Gao: School of Computer and Information Technology, Northeast Petroleum University
Jinhua Tao: State Key Laboratory of Remote Sensing Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing Normal University
Raymond R. Bond: School of Computing, Ulster University
Maurice D. Mulvenna: School of Computing, Ulster University