Deep reinforcement learning based research on low‐carbon scheduling with distribution network schedulable resources
Abstract: Reducing carbon emissions is a crucial way to achieve the goal of green and sustainable development. To accomplish this goal, electric vehicles (EVs) are considered system‐schedulable energy storage devices, suppressing the negative impact of the randomness and fluctuation of renewable energy on the system's operation. In this paper, a coordination control strategy aimed at minimising the carbon emissions of a distribution network between EVs, energy storage devices, and static var compensators (SVCs) is proposed. A model‐free deep reinforcement learning (DRL)‐based approach is developed to learn the optimal control strategy with the constraint of avoiding system overload caused by random EV access. The twin‐delayed deep deterministic policy gradient (TD3) framework is applied to design the learning method. After the model learning is completed, the neural network can quickly generate a real‐time low‐carbon scheduling strategy according to the system operating situation. Finally, simulation on the IEEE 33‐bus system verifies the effectiveness and robustness of this method. On the premise of meeting the charging demand of electric vehicles, this method can optimise the system operation by controlling the charge‐discharge process of EVs, effectively absorbing the renewable energy in the system and reducing the carbon emissions of the system operation.
Main Authors: Shi Chen, Yihong Liu, Zhengwei Guo, Huan Luo, Yi Zhou, Yiwei Qiu, Buxiang Zhou, Tianlei Zang
Format: Article
Language: English
Published: Wiley, 2023-05-01
Series: IET Generation, Transmission & Distribution
Subjects: artificial intelligence; electric vehicles; energy conservation; energy storage
Online Access: https://doi.org/10.1049/gtd2.12806
author | Shi Chen, Yihong Liu, Zhengwei Guo, Huan Luo, Yi Zhou, Yiwei Qiu, Buxiang Zhou, Tianlei Zang
author_sort | Shi Chen |
collection | DOAJ |
description | Abstract Reducing carbon emissions is a crucial way to achieve the goal of green and sustainable development. To accomplish this goal, electric vehicles (EVs) are considered system‐schedulable energy storage devices, suppressing the negative impact of the randomness and fluctuation of renewable energy on the system's operation. In this paper, a coordination control strategy aimed at minimising the carbon emissions of a distribution network between EVs, energy storage devices, and static var compensators (SVCs) is proposed. A model‐free deep reinforcement learning (DRL)‐based approach is developed to learn the optimal control strategy with the constraint of avoiding system overload caused by random EV access. The twin‐delayed deep deterministic policy gradient (TD3) framework is applied to design the learning method. After the model learning is completed, the neural network can quickly generate a real‐time low‐carbon scheduling strategy according to the system operating situation. Finally, simulation on the IEEE 33‐bus system verifies the effectiveness and robustness of this method. On the premise of meeting the charging demand of electric vehicles, this method can optimise the system operation by controlling the charge‐discharge process of EVs, effectively absorbing the renewable energy in the system and reducing the carbon emissions of the system operation. |
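The description names the twin‐delayed deep deterministic policy gradient (TD3) framework as the learning method. The sketch below illustrates TD3's three core mechanics (twin critics with a clipped double‐Q target, smoothed target actions, and delayed policy updates) using toy linear approximators; the dimensions, toy batch, and hyperparameters are illustrative assumptions and do not reproduce the paper's actual networks or grid environment.

```python
import numpy as np

rng = np.random.default_rng(0)
S, A = 4, 2                                    # assumed state / action dims

actor = rng.normal(scale=0.1, size=(S, A))     # linear policy: a = s @ actor
critics = [rng.normal(scale=0.1, size=S + A) for _ in range(2)]  # twin Q's

def q(w, s, a):
    """Linear critic standing in for a Q-network."""
    return float(w @ np.concatenate([s, a]))

def td3_step(batch, it, gamma=0.99, lr=1e-3, delay=2):
    """One TD3 update on a batch of (s, a, r, s2) transitions."""
    global actor
    for s, a, r, s2 in batch:
        # Target action with clipped smoothing noise.
        eps = np.clip(rng.normal(0.0, 0.1, A), -0.5, 0.5)
        a2 = np.clip(s2 @ actor + eps, -1.0, 1.0)
        # Clipped double-Q target: minimum over the twin critics.
        y = r + gamma * min(q(w, s2, a2) for w in critics)
        for w in critics:
            td = q(w, s, a) - y
            w -= lr * td * np.concatenate([s, a])   # SGD on squared TD error
    if it % delay == 0:                             # delayed policy update
        for s, _, _, _ in batch:
            # Deterministic policy gradient through the first critic.
            actor += lr * np.outer(s, critics[0][S:])

# Toy usage: random transitions standing in for grid/EV interactions.
batch = [(rng.normal(size=S), rng.uniform(-1, 1, A),
          rng.normal(), rng.normal(size=S)) for _ in range(32)]
for it in range(1, 11):
    td3_step(batch, it)
```

In the paper's setting the state would encode the distribution network's operating situation and the action the EV charge‐discharge, storage, and SVC set‐points; here both are random vectors purely to exercise the update rule.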
first_indexed | 2024-03-13T10:38:36Z |
format | Article |
id | doaj.art-febe7d72813b4469b83ae99acb80cf00 |
institution | Directory Open Access Journal |
issn | 1751-8687, 1751-8695
language | English |
last_indexed | 2024-03-13T10:38:36Z |
publishDate | 2023-05-01 |
publisher | Wiley |
record_format | Article |
series | IET Generation, Transmission & Distribution |
spelling | doaj.art-febe7d72813b4469b83ae99acb80cf00 (record timestamp 2023-05-18T05:19:43Z). Wiley, IET Generation, Transmission & Distribution, ISSN 1751-8687 / 1751-8695, published 2023-05-01, Vol. 17, Iss. 10, pp. 2289-2300, doi:10.1049/gtd2.12806. Deep reinforcement learning based research on low‐carbon scheduling with distribution network schedulable resources. Shi Chen, Yihong Liu, Zhengwei Guo, Huan Luo, Yi Zhou, Yiwei Qiu, Buxiang Zhou, Tianlei Zang; all with the College of Electrical Engineering, Sichuan University, Chengdu, People's Republic of China. Abstract as in the description field above. Keywords: artificial intelligence; electric vehicles; energy conservation; energy storage. Online access: https://doi.org/10.1049/gtd2.12806
title | Deep reinforcement learning based research on low‐carbon scheduling with distribution network schedulable resources |
topic | artificial intelligence electric vehicles energy conservation energy storage |
url | https://doi.org/10.1049/gtd2.12806 |