Deep Learning for Load Forecasting: Sequence to Sequence Recurrent Neural Networks With Attention
The biggest contributors to global warming are energy production and use. Moreover, the push for electric vehicles and other economic developments is expected to further increase energy use. To combat these challenges, electrical load forecasting is essential as it supports energy production planning...
Main Authors: | Ljubisa Sehovac, Katarina Grolinger |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Attention mechanism, gated recurrent units (GRU), load forecasting, long short-term memory (LSTM) |
Online Access: | https://ieeexplore.ieee.org/document/9006868/ |
author | Ljubisa Sehovac, Katarina Grolinger |
collection | DOAJ |
description | The biggest contributors to global warming are energy production and use. Moreover, the push for electric vehicles and other economic developments is expected to further increase energy use. To combat these challenges, electrical load forecasting is essential as it supports energy production planning and scheduling, assists with budgeting, and helps identify saving opportunities. Machine learning approaches commonly used for energy forecasting, such as feedforward neural networks and support vector regression, encounter challenges in capturing time dependencies. Consequently, this paper proposes a Sequence to Sequence Recurrent Neural Network (S2S RNN) with Attention for electrical load forecasting. The S2S architecture from language translation is adapted for load forecasting, and a corresponding sample generation approach is designed. The RNN captures time dependencies present in the load data, and the S2S model further improves time modeling by combining two RNNs: an encoder and a decoder. The attention mechanism alleviates the burden of connecting the encoder and decoder. The experiments evaluated attention mechanisms with different RNN cells (vanilla, LSTM, and GRU) and with varied time horizons. Results show that S2S with Bahdanau attention outperforms the other models. Accuracy decreases as the forecasting horizon increases; however, longer input sequences do not always increase accuracy. |
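The abstract names Bahdanau attention as the best-performing connection between encoder and decoder but does not reproduce the mechanism here. A minimal NumPy sketch of Bahdanau (additive) attention is shown below; the function name, weight matrices, and dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def bahdanau_attention(dec_state, enc_states, W_dec, W_enc, v):
    """Additive (Bahdanau) attention.

    dec_state:  (h,)    current decoder hidden state
    enc_states: (T, h)  encoder hidden states for T input time steps
    Returns the context vector (h,) and attention weights (T,).
    """
    # Score each encoder state against the decoder state:
    #   score_t = v . tanh(W_dec^T s + W_enc^T h_t)
    scores = np.tanh(dec_state @ W_dec + enc_states @ W_enc) @ v  # (T,)
    weights = softmax(scores)                                     # (T,)
    # The context vector is the attention-weighted sum of encoder states.
    context = weights @ enc_states                                # (h,)
    return context, weights

# Toy example: 5 encoder steps, hidden size 4, attention size 3
rng = np.random.default_rng(0)
T, h, a = 5, 4, 3
enc = rng.normal(size=(T, h))
s = rng.normal(size=h)
W_dec = rng.normal(size=(h, a))
W_enc = rng.normal(size=(h, a))
v = rng.normal(size=a)

ctx, w = bahdanau_attention(s, enc, W_dec, W_enc, v)
```

At each decoding step, the decoder would consume `ctx` alongside its previous output, which is what lets it look back at any encoder time step instead of relying on a single fixed summary vector.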
format | Article |
id | doaj.art-a5312d62cd34407390ea634ad66a5d6b |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
publishDate | 2020-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | IEEE Access, vol. 8, pp. 36411-36426, 2020-01-01. DOI: 10.1109/ACCESS.2020.2975738 (IEEE document 9006868). Ljubisa Sehovac (https://orcid.org/0000-0001-5152-5390) and Katarina Grolinger (https://orcid.org/0000-0003-0062-8212), Department of Electrical and Computer Engineering, Western University, London, ON, Canada. |
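The abstract states that "a corresponding sample generation approach is designed" for the S2S model but does not detail it in this record. A common sliding-window scheme for building encoder/decoder training pairs from a load series could look like the following sketch; the function name and window lengths are illustrative assumptions.

```python
def make_s2s_samples(series, in_len, out_len, step=1):
    """Slide a window over a load series to build (encoder, decoder) pairs.

    series:  sequence of load readings
    in_len:  encoder input length (history fed to the encoder)
    out_len: decoder target length (forecast horizon)
    step:    stride between consecutive windows
    Returns a list of (input_seq, target_seq) tuples.
    """
    samples = []
    for start in range(0, len(series) - in_len - out_len + 1, step):
        x = series[start : start + in_len]              # encoder input
        y = series[start + in_len : start + in_len + out_len]  # target
        samples.append((x, y))
    return samples

# Toy series of 10 readings, 4-step history, 2-step forecast horizon
pairs = make_s2s_samples(list(range(10)), in_len=4, out_len=2)
# First pair: encoder sees [0, 1, 2, 3], decoder predicts [4, 5]
```

Varying `in_len` and `out_len` here corresponds to the paper's experiments with different input-sequence lengths and forecasting horizons.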
title | Deep Learning for Load Forecasting: Sequence to Sequence Recurrent Neural Networks With Attention |
topic | Attention mechanism gated recurrent units GRU load forecasting long short-term memory LSTM |
url | https://ieeexplore.ieee.org/document/9006868/ |