LS-LSTM-AE: Power load forecasting via Long-Short series features and LSTM-Autoencoder
To address the weak representation ability and severe loss of time-series features that traditional methods exhibit on large-scale, complex power load forecasting tasks, an LSTM-Autoencoder model that integrates the long-term and short-term features of the samples is proposed for load forecasting. The...
Main Authors: | Xin Tong, Jingya Wang, Changlin Zhang, Teng Wu, Haitao Wang, Yu Wang |
Format: | Article |
Language: | English |
Published: | Elsevier, 2022-04-01 |
Series: | Energy Reports |
Subjects: | Load forecasting; Deep learning; LSTM; Autoencoder; Unsupervised learning; Supervised learning |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2352484721013196 |
author | Xin Tong; Jingya Wang; Changlin Zhang; Teng Wu; Haitao Wang; Yu Wang |
author_sort | Xin Tong |
collection | DOAJ |
description | To address the weak representation ability and severe loss of time-series features that traditional methods exhibit on large-scale, complex power load forecasting tasks, an LSTM-Autoencoder model that integrates the long-term and short-term features of the samples is proposed for load forecasting. The encoder receives a long time series and a short time series simultaneously as input, extracts time-series features at different levels, and generates the corresponding latent vectors. The decoder reconstructs the input sequence while also outputting the prediction results, which ensures that the latent vectors remain meaningful. In addition, the model is trained with a mixture of supervised and unsupervised methods. Experiments on a publicly available dataset from the Alberta Electric System Operator show that the proposed method outperforms many existing mainstream methods, achieving a mean absolute error of less than 52 MW between the prediction results and the actual load values. |
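The description above outlines the architecture but gives no implementation details. A minimal, hypothetical PyTorch sketch of the general idea (two LSTM encoders over a long and a short window, one shared latent vector, a decoder that reconstructs the short window, and a prediction head trained with a combined reconstruction and forecasting loss) might look like the following. All layer sizes, the window lengths (168 h / 24 h), and the loss weighting `alpha` are assumptions for illustration, not the authors' published configuration.

```python
# Hypothetical sketch of the LS-LSTM-AE idea described in the abstract.
# Layer sizes, window lengths, and loss weighting are assumptions, not the
# configuration reported in the paper.
import torch
import torch.nn as nn


class LSLSTMAE(nn.Module):
    def __init__(self, n_features=1, hidden=64, latent=32, short_len=24, long_len=168):
        super().__init__()
        self.short_len = short_len
        # Separate encoders for the long and short input series.
        self.enc_long = nn.LSTM(n_features, hidden, batch_first=True)
        self.enc_short = nn.LSTM(n_features, hidden, batch_first=True)
        # Fuse both final hidden states into one latent vector.
        self.to_latent = nn.Linear(2 * hidden, latent)
        # Decoder reconstructs the short series from the latent vector.
        self.dec = nn.LSTM(latent, hidden, batch_first=True)
        self.recon_head = nn.Linear(hidden, n_features)
        # Prediction head outputs the next-step load from the latent vector.
        self.pred_head = nn.Linear(latent, 1)

    def forward(self, x_long, x_short):
        _, (h_long, _) = self.enc_long(x_long)     # h: (1, B, hidden)
        _, (h_short, _) = self.enc_short(x_short)
        z = self.to_latent(torch.cat([h_long[-1], h_short[-1]], dim=-1))  # (B, latent)
        # Repeat the latent vector across the short window for reconstruction.
        dec_in = z.unsqueeze(1).repeat(1, self.short_len, 1)
        dec_out, _ = self.dec(dec_in)
        recon = self.recon_head(dec_out)           # (B, short_len, n_features)
        pred = self.pred_head(z)                   # (B, 1)
        return recon, pred


# Mixed objective: unsupervised reconstruction loss plus supervised forecast loss.
def loss_fn(recon, x_short, pred, y, alpha=0.5):
    mse = nn.MSELoss()
    return alpha * mse(recon, x_short) + (1 - alpha) * mse(pred, y)


if __name__ == "__main__":
    model = LSLSTMAE()
    x_long = torch.randn(16, 168, 1)   # e.g. one week of hourly load
    x_short = torch.randn(16, 24, 1)   # e.g. the most recent day
    y = torch.randn(16, 1)             # next-hour load target
    recon, pred = model(x_long, x_short)
    print(loss_fn(recon, x_short, pred, y).item())
```

The mixed loss mirrors the combination of unsupervised (reconstruction) and supervised (forecasting) training signals mentioned in the abstract; the actual weighting and training schedule used by the authors are not given in this record.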
format | Article |
id | doaj.art-9239b705619b4a01b4ec31a59a74f368 |
institution | Directory Open Access Journal |
issn | 2352-4847 |
language | English |
publishDate | 2022-04-01 |
publisher | Elsevier |
record_format | Article |
series | Energy Reports |
affiliations | Xin Tong, Jingya Wang (corresponding author), and Yu Wang: People’s Public Security University of China, Xicheng District, Beijing 100038, China; Changlin Zhang: Henan Police College, Jinshui District, Zhengzhou 450046, China; Teng Wu: Kanazawa University, Kakuma Machi, Kanazawa-shi 920-1164, Japan; Haitao Wang: China People’s Police University, Anci District, Langfang 065000, China |
volume_pages | Energy Reports 8 (2022-04-01), pp. 596-603 |
title | LS-LSTM-AE: Power load forecasting via Long-Short series features and LSTM-Autoencoder |
topic | Load forecasting Deep learning LSTM Autoencoder Unsupervised learning Supervised learning |
url | http://www.sciencedirect.com/science/article/pii/S2352484721013196 |