DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting

Abstract: Transformer-based approaches excel at long-term time series forecasting. These models leverage stacked structures and self-attention mechanisms to effectively model dependencies in series data. While some approaches adopt sparse attention to tackle the quadratic time complexity of self-attention, sparsity can limit information utilization. We introduce a double-branch attention mechanism that simultaneously captures intricate dependencies from both the temporal and the variable perspective. Moreover, observing that self-attention allocates near-identical attention to different query positions, we propose query-independent attention, which improves efficiency and reduces the impact of redundant information. We integrate the double-branch query-independent attention into popular transformer-based methods such as Informer, Autoformer, and the Non-stationary Transformer. Experiments on six real-world benchmarks consistently show that the proposed attention mechanism substantially improves long-term series forecasting performance over the baseline approaches.
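
The abstract describes the two ideas only at a high level. The minimal PyTorch sketch below is one possible reading of them, not the authors' implementation: the class names, the key-only scoring used to realize query-independent weights, and the simple averaging of the two branches are all illustrative assumptions. DBAFormer's exact formulation is given in the paper at https://doi.org/10.1007/s44230-023-00037-z.

```python
# Illustrative sketch only -- not the published DBAFormer code. Shapes,
# layer choices, and the fusion rule are assumptions made for clarity.
import torch
import torch.nn as nn


class QueryIndependentAttention(nn.Module):
    """Attention whose weights are shared by all query positions.

    Because vanilla self-attention often assigns near-identical weights to
    different queries, a single global weight per token (computed from the
    tokens alone, with no per-query term) is reused everywhere, reducing
    the cost from O(L^2) to O(L) in the number of tokens L.
    """

    def __init__(self, d_feature: int):
        super().__init__()
        self.score = nn.Linear(d_feature, 1)          # one weight per token
        self.value = nn.Linear(d_feature, d_feature)  # value projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_tokens, d_feature)
        weights = torch.softmax(self.score(x), dim=1)                 # (B, L, 1)
        context = (weights * self.value(x)).sum(dim=1, keepdim=True)  # (B, 1, D)
        return x + context                            # broadcast to every token


class DoubleBranchAttention(nn.Module):
    """Two parallel branches over a (batch, time, variables) window:
    the temporal branch treats time steps as tokens, the variable branch
    treats variables as tokens, and their outputs are averaged."""

    def __init__(self, seq_len: int, n_vars: int):
        super().__init__()
        self.temporal = QueryIndependentAttention(n_vars)   # tokens = time steps
        self.variable = QueryIndependentAttention(seq_len)  # tokens = variables

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars)
        t = self.temporal(x)                                   # mix across time
        v = self.variable(x.transpose(1, 2)).transpose(1, 2)   # mix across variables
        return (t + v) / 2


# Quick shape check on a toy multivariate window.
if __name__ == "__main__":
    x = torch.randn(8, 96, 7)                      # 96 time steps, 7 variables
    block = DoubleBranchAttention(seq_len=96, n_vars=7)
    print(block(x).shape)                          # torch.Size([8, 96, 7])
```

Because the same weight vector is reused for every query position, each branch in this sketch runs in time linear in its token count (time steps or variables), which is the efficiency argument the abstract makes for query-independent attention.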


Bibliographic Details
Main Authors: Ji Huang, Minbo Ma, Yongsheng Dai, Jie Hu, Shengdong Du (School of Computing and Artificial Intelligence, Southwest Jiaotong University)
Format: Article
Language: English
Published: Springer Nature, 2023-07-01
Series: Human-Centric Intelligent Systems, Vol. 3, No. 3, pp. 263-274
ISSN: 2667-1336
Collection: DOAJ (Directory of Open Access Journals)
Subjects: Time series forecasting; Self-attention; Encoder-decoder; Transformer-based models
Online Access: https://doi.org/10.1007/s44230-023-00037-z