Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids
Economic and policy factors are driving the continuous increase in the adoption and usage of electric vehicles (EVs). However, despite being a cleaner alternative to combustion engine vehicles, EVs have negative impacts on the lifespan of microgrid equipment and energy balance due to increased power demands and the timing of their usage.
Main Authors: | Viorica Rozina Chifu, Tudor Cioara, Cristina Bianca Pop, Horia Gabriel Rusu, Ionut Anghel |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-02-01 |
Series: | Applied Sciences |
Subjects: | deep Q-learning; EV scheduling; vehicle to grid; demand response; reinforcement learning; model-free optimization |
Online Access: | https://www.mdpi.com/2076-3417/14/4/1421 |
_version_ | 1827344299874844672 |
author | Viorica Rozina Chifu; Tudor Cioara; Cristina Bianca Pop; Horia Gabriel Rusu; Ionut Anghel |
author_facet | Viorica Rozina Chifu; Tudor Cioara; Cristina Bianca Pop; Horia Gabriel Rusu; Ionut Anghel |
author_sort | Viorica Rozina Chifu |
collection | DOAJ |
description | Economic and policy factors are driving the continuous increase in the adoption and usage of electric vehicles (EVs). However, despite being a cleaner alternative to combustion engine vehicles, EVs have negative impacts on the lifespan of microgrid equipment and energy balance due to increased power demands and the timing of their usage. In our view, grid management should leverage EV scheduling flexibility to support local network balancing through active participation in demand response programs. In this paper, we propose a model-free solution that leverages deep Q-learning to schedule the charging and discharging activities of EVs within a microgrid so that they align with a target energy profile provided by the distribution system operator. We adapted the Bellman equation to assess the value of a state based on specific rewards for EV scheduling actions, used a neural network to estimate the Q-values of the available actions, and applied the epsilon-greedy algorithm to balance exploration and exploitation in meeting the target energy profile. The results are promising, showing the effectiveness of the proposed solution in scheduling the charging and discharging actions of a fleet of 30 EVs to align with the target energy profile in demand response programs, achieving a Pearson correlation coefficient of 0.99. The solution also demonstrates a high degree of adaptability in managing dynamic EV scheduling situations influenced by various state-of-charge distributions and e-mobility features. Adaptability is achieved solely through learning from data, without requiring prior knowledge, configuration, or fine-tuning. |
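The abstract names three ingredients of the approach: a neural network that estimates Q-values, an adapted Bellman update, and epsilon-greedy action selection. A minimal sketch of how these pieces fit together is shown below; this is an illustration, not the authors' implementation, and the action set, state features, reward value, and the toy linear "Q-network" are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete action set for one EV and one time slot (not from the paper):
# 0 = idle, 1 = charge, 2 = discharge
N_ACTIONS = 3

def epsilon_greedy(q_values, epsilon, rng):
    """With probability epsilon explore (random action), otherwise exploit (argmax)."""
    if rng.random() < epsilon:
        return int(rng.integers(len(q_values)))
    return int(np.argmax(q_values))

def bellman_target(reward, next_q_values, gamma=0.99, done=False):
    """One-step Bellman target: r + gamma * max_a' Q(s', a')."""
    if done:
        return float(reward)
    return float(reward + gamma * np.max(next_q_values))

# Toy linear model Q(s) = s @ W standing in for the paper's neural network.
STATE_DIM = 4  # placeholder features, e.g. state of charge, hour, target power, price
W = rng.normal(scale=0.1, size=(STATE_DIM, N_ACTIONS))

def q_net(state):
    return state @ W

# One temporal-difference update step.
state = rng.normal(size=STATE_DIM)
q = q_net(state)
action = epsilon_greedy(q, epsilon=0.1, rng=rng)
reward = 1.0  # hypothetical reward for moving toward the target energy profile
next_state = rng.normal(size=STATE_DIM)
target = bellman_target(reward, q_net(next_state))

alpha = 0.01  # learning rate
td_error = target - q[action]
W[:, action] += alpha * td_error * state  # gradient step on the squared TD error
```

In practice, deep Q-learning implementations typically decay epsilon over training so that early episodes explore the action space and later episodes exploit the learned Q-values.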
first_indexed | 2024-03-07T22:44:05Z |
format | Article |
id | doaj.art-efae7d0750c6433b95e8dbf1fc5802e6 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
last_indexed | 2024-03-07T22:44:05Z |
publishDate | 2024-02-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj.art-efae7d0750c6433b95e8dbf1fc5802e6 2024-02-23T15:05:56Z eng MDPI AG Applied Sciences 2076-3417 2024-02-01 14(4):1421 10.3390/app14041421 Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids Viorica Rozina Chifu; Tudor Cioara; Cristina Bianca Pop; Horia Gabriel Rusu; Ionut Anghel (Computer Science Department, Technical University of Cluj-Napoca, Memorandumului 28, 400114 Cluj-Napoca, Romania) https://www.mdpi.com/2076-3417/14/4/1421 deep Q-learning; EV scheduling; vehicle to grid; demand response; reinforcement learning; model-free optimization |
spellingShingle | Viorica Rozina Chifu; Tudor Cioara; Cristina Bianca Pop; Horia Gabriel Rusu; Ionut Anghel; Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids; Applied Sciences; deep Q-learning; EV scheduling; vehicle to grid; demand response; reinforcement learning; model-free optimization |
title | Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids |
title_full | Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids |
title_fullStr | Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids |
title_full_unstemmed | Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids |
title_short | Deep Q-Learning-Based Smart Scheduling of EVs for Demand Response in Smart Grids |
title_sort | deep q learning based smart scheduling of evs for demand response in smart grids |
topic | deep Q-learning; EV scheduling; vehicle to grid; demand response; reinforcement learning; model-free optimization |
url | https://www.mdpi.com/2076-3417/14/4/1421 |
work_keys_str_mv | AT vioricarozinachifu deepqlearningbasedsmartschedulingofevsfordemandresponseinsmartgrids AT tudorcioara deepqlearningbasedsmartschedulingofevsfordemandresponseinsmartgrids AT cristinabiancapop deepqlearningbasedsmartschedulingofevsfordemandresponseinsmartgrids AT horiagabrielrusu deepqlearningbasedsmartschedulingofevsfordemandresponseinsmartgrids AT ionutanghel deepqlearningbasedsmartschedulingofevsfordemandresponseinsmartgrids |