Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading

The concept of the Prosumer has enabled consumers to actively participate in Peer-to-Peer (P2P) energy trading, particularly as Renewable Energy Sources (RESs) and Electric Vehicles (EVs) have become more accessible and cost-effective. In addition to P2P energy trading, prosumers benefit from the relatively high energy capacity of EVs through the integration of Vehicle-to-X (V2X) technologies, such as Vehicle-to-Home (V2H), Vehicle-to-Load (V2L), and Vehicle-to-Grid (V2G). Due to the complex pricing and energy exchange mechanisms of P2P energy trading and the presence of multiple EVs with V2X technologies, optimization of an Energy Management System (EMS) is required to allocate energy efficiently within the cluster. In this paper, a Deep Reinforcement Learning (DRL)-based EMS optimization method is proposed to optimize the pricing and energy exchange mechanisms of P2P energy trading without affecting the comfort of prosumers. The proposed EMS is applied to a small-scale cluster-based environment, including multiple (6) prosumers, P2P energy trading with novel hybrid pricing and energy exchange mechanisms, and V2X technologies (V2H, V2L, and V2G), to reduce overall energy costs and increase Self-Sufficiency Ratios (SSRs). A multi-agent DRL algorithm based on Double Deep Q-Network (DDQN) agents is implemented, and the environment is formulated as a Markov Decision Process (MDP) to optimize the decision-making process. Numerical results show that the proposed EMS reduces overall energy costs by 19.18%, increases SSRs by 9.39%, and achieves an overall SSR of 65.87%. Additionally, the numerical results indicate that model-free DRL, such as the DDQN-agent-based Deep Q-Network (DQN) Reinforcement Learning (RL) algorithm, shows promise in eliminating energy management complexities involving multiple uncertainties.
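As context for the abstract above, the DDQN agents it mentions would, in the standard Double DQN formulation (a textbook form, not quoted from the paper itself), compute their learning target by letting the online network select the next action while the target network evaluates it:

\[
y_t = r_t + \gamma \, Q\big(s_{t+1}, \arg\max_{a'} Q(s_{t+1}, a'; \theta_t);\, \theta_t^{-}\big),
\]

where \(s_t\), \(a_t\), and \(r_t\) are the MDP state, action, and reward, \(\gamma\) is the discount factor, \(\theta_t\) are the online-network parameters, and \(\theta_t^{-}\) are the periodically synchronized target-network parameters.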

Bibliographic Details
Main Authors: Mete Yavuz, Omer Cihan Kivanc
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access
Subjects: Energy management system; peer-to-peer energy trading; vehicle-to-home; multi-agent reinforcement learning; deep reinforcement learning; smart grids
Online Access: https://ieeexplore.ieee.org/document/10452350/
_version_ 1797272663285039104
author Mete Yavuz
Omer Cihan Kivanc
author_facet Mete Yavuz
Omer Cihan Kivanc
author_sort Mete Yavuz
collection DOAJ
description The concept of the Prosumer has enabled consumers to actively participate in Peer-to-Peer (P2P) energy trading, particularly as Renewable Energy Sources (RESs) and Electric Vehicles (EVs) have become more accessible and cost-effective. In addition to P2P energy trading, prosumers benefit from the relatively high energy capacity of EVs through the integration of Vehicle-to-X (V2X) technologies, such as Vehicle-to-Home (V2H), Vehicle-to-Load (V2L), and Vehicle-to-Grid (V2G). Due to the complex pricing and energy exchange mechanisms of P2P energy trading and the presence of multiple EVs with V2X technologies, optimization of an Energy Management System (EMS) is required to allocate energy efficiently within the cluster. In this paper, a Deep Reinforcement Learning (DRL)-based EMS optimization method is proposed to optimize the pricing and energy exchange mechanisms of P2P energy trading without affecting the comfort of prosumers. The proposed EMS is applied to a small-scale cluster-based environment, including multiple (6) prosumers, P2P energy trading with novel hybrid pricing and energy exchange mechanisms, and V2X technologies (V2H, V2L, and V2G), to reduce overall energy costs and increase Self-Sufficiency Ratios (SSRs). A multi-agent DRL algorithm based on Double Deep Q-Network (DDQN) agents is implemented, and the environment is formulated as a Markov Decision Process (MDP) to optimize the decision-making process. Numerical results show that the proposed EMS reduces overall energy costs by 19.18%, increases SSRs by 9.39%, and achieves an overall SSR of 65.87%. Additionally, the numerical results indicate that model-free DRL, such as the DDQN-agent-based Deep Q-Network (DQN) Reinforcement Learning (RL) algorithm, shows promise in eliminating energy management complexities involving multiple uncertainties.
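The Self-Sufficiency Ratio (SSR) figures in the description above are not accompanied by a definition in this record; a commonly used definition (assumed here, not quoted from the paper) is the share of the total load covered without grid imports:

\[
\mathrm{SSR} = \frac{E_{\mathrm{load}} - E_{\mathrm{grid\ import}}}{E_{\mathrm{load}}} \times 100\%,
\]

so, as an illustrative example, a cluster consuming 100 kWh while importing 34.13 kWh from the grid would have an SSR of 65.87%, the overall value reported above.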
first_indexed 2024-03-07T14:31:37Z
format Article
id doaj.art-06fc2fd395ed4f5180886a6aa705b16e
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-03-07T14:31:37Z
publishDate 2024-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-06fc2fd395ed4f5180886a6aa705b16e (2024-03-06T00:01:29Z)
Language: English
Publisher: IEEE
Series: IEEE Access (ISSN 2169-3536)
Published: 2024-01-01, vol. 12, pp. 31551-31575
DOI: 10.1109/ACCESS.2024.3370922
IEEE document number: 10452350
Title: Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
Authors: Mete Yavuz (https://orcid.org/0000-0001-9153-1605), Department of Electrical and Electronics Engineering, Istanbul Okan University, İstanbul, Turkey; Omer Cihan Kivanc, Department of Electrical and Electronics Engineering, Istanbul Okan University, İstanbul, Turkey
Online Access: https://ieeexplore.ieee.org/document/10452350/
Topics: Energy management system; peer-to-peer energy trading; vehicle-to-home; multi-agent reinforcement learning; deep reinforcement learning; smart grids
spellingShingle Mete Yavuz
Omer Cihan Kivanc
Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
IEEE Access
Energy management system
peer-to-peer energy trading
vehicle-to-home
multi-agent reinforcement learning
deep reinforcement learning
smart grids
title Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
title_full Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
title_fullStr Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
title_full_unstemmed Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
title_short Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
title_sort optimization of a cluster based energy management system using deep reinforcement learning without affecting prosumer comfort v2x technologies and peer to peer energy trading
topic Energy management system
peer-to-peer energy trading
vehicle-to-home
multi-agent reinforcement learning
deep reinforcement learning
smart grids
url https://ieeexplore.ieee.org/document/10452350/
work_keys_str_mv AT meteyavuz optimizationofaclusterbasedenergymanagementsystemusingdeepreinforcementlearningwithoutaffectingprosumercomfortv2xtechnologiesandpeertopeerenergytrading
AT omercihankivanc optimizationofaclusterbasedenergymanagementsystemusingdeepreinforcementlearningwithoutaffectingprosumercomfortv2xtechnologiesandpeertopeerenergytrading