Optimizing electricity demand scheduling in microgrids using deep reinforcement learning for cost‐efficiency
Abstract Renewable energy sources (RES) are increasingly being developed and used to address the energy crisis and protect the environment. However, the large‐scale integration of wind and solar energy into the power grid is still challenging and limits the adoption of these new energy sources. Microgrids (MGs) are small‐scale power generation and distribution systems that can effectively integrate renewable energy, electric loads, and energy storage systems (ESS). By using MGs, it is possible to consume renewable energy locally and reduce energy losses from long‐distance transmission. This paper proposes a deep reinforcement learning (DRL)‐based energy management system (EMS) called DRL‐MG to process and schedule energy purchase requests from customers in real time. Specifically, the aim of this paper is to enhance the quality of service (QoS) for customers and reduce their electricity costs by proposing an approach that utilizes a Deep Q‐learning Network (DQN) model. The experimental results indicate that the proposed method significantly outperforms commonly used real‐time scheduling methods.
Main Authors: | Baoyin Xiong, Yiguo Guo, Liyang Zhang, Jianbin Li, Xiufeng Liu, Long Cheng |
Format: | Article |
Language: | English |
Published: | Wiley, 2023-06-01 |
Series: | IET Generation, Transmission & Distribution |
Subjects: | distributed algorithms; distributed control; electricity supply industry; power control |
Online Access: | https://doi.org/10.1049/gtd2.12866 |
author | Baoyin Xiong Yiguo Guo Liyang Zhang Jianbin Li Xiufeng Liu Long Cheng |
collection | DOAJ |
description | Abstract Renewable energy sources (RES) are increasingly being developed and used to address the energy crisis and protect the environment. However, the large‐scale integration of wind and solar energy into the power grid is still challenging and limits the adoption of these new energy sources. Microgrids (MGs) are small‐scale power generation and distribution systems that can effectively integrate renewable energy, electric loads, and energy storage systems (ESS). By using MGs, it is possible to consume renewable energy locally and reduce energy losses from long‐distance transmission. This paper proposes a deep reinforcement learning (DRL)‐based energy management system (EMS) called DRL‐MG to process and schedule energy purchase requests from customers in real‐time. Specifically, the aim of this paper is to enhance the quality of service (QoS) for customers and reduce their electricity costs by proposing an approach that utilizes a Deep Q‐learning Network (DQN) model. The experimental results indicate that the proposed method outperforms commonly used real‐time scheduling methods significantly. |
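The abstract names a Deep Q-learning Network (DQN) that schedules customers' energy purchase requests in real time to trade off electricity cost against quality of service, but the record carries no implementation details. As a rough illustration only, the sketch below uses tabular Q-learning (the table standing in for the paper's neural network; the update rule is the same) on an invented hourly price curve. The prices, the 0.05/h delay penalty used as a QoS proxy, and the one-step episode structure are all assumptions for the sketch, not the authors' formulation.

```python
import numpy as np

# Toy setup (assumed, not from the paper): a purchase request arriving at some
# hour must be scheduled into one of T hourly slots; cheaper slots save money,
# later slots cost waiting time.
rng = np.random.default_rng(0)
T = 6
price = np.array([0.30, 0.28, 0.15, 0.12, 0.20, 0.35])  # assumed $/kWh per slot

def reward(slot, arrival):
    # Lower price and shorter delay are both rewarded; the 0.05/h delay
    # penalty is an illustrative QoS proxy.
    delay = slot - arrival
    return -price[slot] - 0.05 * delay

# Q-table over (arrival hour, chosen slot); a DQN replaces this table with a
# neural network so the state can be continuous and high-dimensional.
Q = np.zeros((T, T))
alpha, eps = 0.1, 0.2
for episode in range(5000):
    arrival = rng.integers(0, T - 1)
    feasible = np.arange(arrival, T)      # can only schedule now or later
    if rng.random() < eps:                # epsilon-greedy exploration
        slot = rng.choice(feasible)
    else:
        slot = feasible[np.argmax(Q[arrival, feasible])]
    # One-step episode: the target is the immediate reward (no bootstrapping).
    Q[arrival, slot] += alpha * (reward(slot, arrival) - Q[arrival, slot])

best = int(np.argmax(Q[0]))  # learned slot for a request arriving at hour 0
print(best)
```

For a request arriving at hour 0, the learned policy settles on one of the cheap mid-day slots, since the small delay penalty is outweighed by the price drop; in the paper's setting the network would additionally condition on load, storage state, and renewable output.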
first_indexed | 2024-03-13T05:33:44Z |
format | Article |
id | doaj.art-af1f815a5096456cbd86cdff766f3dc1 |
institution | Directory Open Access Journal |
issn | 1751-8687 1751-8695 |
language | English |
last_indexed | 2024-03-13T05:33:44Z |
publishDate | 2023-06-01 |
publisher | Wiley |
record_format | Article |
series | IET Generation, Transmission & Distribution |
spelling | IET Generation, Transmission & Distribution, vol. 17, no. 11, pp. 2535-2544 (2023-06-01), Wiley. doi:10.1049/gtd2.12866 |
affiliations | Baoyin Xiong, Jianbin Li, Long Cheng: School of Control and Computer Engineering, North China Electric Power University, Changping District, Beijing, China. Yiguo Guo, Liyang Zhang: Economic & Technology Research Institute, State Grid Shandong Electric Power Company, Jinan, Shandong Province, China. Xiufeng Liu: Department of Technology, Management and Economics, Technical University of Denmark, Kgs. Lyngby, Denmark |
title | Optimizing electricity demand scheduling in microgrids using deep reinforcement learning for cost‐efficiency |
topic | distributed algorithms distributed control electricity supply industry power control |
url | https://doi.org/10.1049/gtd2.12866 |