Transformer-Based Model for Electrical Load Forecasting
Amongst energy-related CO₂ emissions, electricity is the largest single contributor...
Main Authors: | Alexandra L’Heureux, Katarina Grolinger, Miriam A. M. Capretz |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-07-01 |
Series: | Energies |
Subjects: | electrical load forecasting; deep learning; transformer architecture; machine learning; sequence-to-sequence model |
Online Access: | https://www.mdpi.com/1996-1073/15/14/4993 |
_version_ | 1797406669927350272 |
---|---|
author | Alexandra L’Heureux Katarina Grolinger Miriam A. M. Capretz |
author_facet | Alexandra L’Heureux Katarina Grolinger Miriam A. M. Capretz |
author_sort | Alexandra L’Heureux |
collection | DOAJ |
description | Amongst energy-related CO₂ emissions, electricity is the largest single contributor, and with the proliferation of electric vehicles and other developments, energy use is expected to increase. Load forecasting is essential for combating these issues as it balances demand and production and contributes to energy management. Current state-of-the-art solutions such as recurrent neural networks (RNNs) and sequence-to-sequence algorithms (Seq2Seq) are highly accurate, but most studies examine them on a single data stream. On the other hand, in natural language processing (NLP), transformer architecture has become the dominant technique, outperforming RNN and Seq2Seq algorithms while also allowing parallelization. Consequently, this paper proposes a transformer-based architecture for load forecasting by modifying the NLP transformer workflow, adding N-space transformation, and designing a novel technique for handling contextual features. Moreover, in contrast to most load forecasting studies, we evaluate the proposed solution on different data streams under various forecasting horizons and input window lengths in order to ensure result reproducibility. Results show that the proposed approach successfully handles time series with contextual data and outperforms the state-of-the-art Seq2Seq models. |
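The description's evaluation "under various forecasting horizons and input window lengths" implies a sliding-window framing of the load series, with contextual features aligned to each input step. A minimal sketch of that framing (the function name, toy series, and hour-of-day context are illustrative assumptions, not taken from the paper):

```python
def make_windows(series, context, window, horizon):
    """Pair each input window of past load values (with aligned
    contextual features, e.g. hour-of-day) with the next `horizon`
    values to forecast. Returns (inputs, targets) lists."""
    inputs, targets = [], []
    for start in range(len(series) - window - horizon + 1):
        load = series[start:start + window]
        ctx = context[start:start + window]
        # Each encoder token carries a load value plus its context.
        inputs.append(list(zip(load, ctx)))
        targets.append(series[start + window:start + window + horizon])
    return inputs, targets

# Toy hourly load with hour-of-day as the contextual feature.
load = [50, 52, 55, 60, 58, 54, 51, 49]
hours = list(range(8))
X, y = make_windows(load, hours, window=4, horizon=2)
print(len(X))        # 3 window/horizon pairs fit in 8 samples
print(X[0], y[0])    # first window with context, and its targets
```

Varying `window` and `horizon` reproduces the evaluation grid the authors describe; in the paper's transformer, each `(load, context)` token would then be embedded before entering the encoder.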
first_indexed | 2024-03-09T03:30:00Z |
format | Article |
id | doaj.art-e18d6b045cc54ccc92e26662a3ad9eb9 |
institution | Directory Open Access Journal |
issn | 1996-1073 |
language | English |
last_indexed | 2024-03-09T03:30:00Z |
publishDate | 2022-07-01 |
publisher | MDPI AG |
record_format | Article |
series | Energies |
spelling | doaj.art-e18d6b045cc54ccc92e26662a3ad9eb9; indexed 2023-12-03T14:58:15Z; eng; MDPI AG; Energies, ISSN 1996-1073, 2022-07-01, vol. 15, no. 14, art. 4993; doi:10.3390/en15144993; Transformer-Based Model for Electrical Load Forecasting; Alexandra L’Heureux, Katarina Grolinger, and Miriam A. M. Capretz (all: Department of Electrical and Computer Engineering, The University of Western Ontario, London, ON N6A 5B9, Canada); abstract and keywords as given in the description and topic fields; https://www.mdpi.com/1996-1073/15/14/4993 |
title | Transformer-Based Model for Electrical Load Forecasting |
topic | electrical load forecasting; deep learning; transformer architecture; machine learning; sequence-to-sequence model |
url | https://www.mdpi.com/1996-1073/15/14/4993 |