DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting
Abstract: Transformer-based approaches excel at long-term time series forecasting. These models leverage stacked structures and self-attention mechanisms to effectively model dependencies in series data. While some approaches prioritize sparse attention to tackle the quadratic time complexity...
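The abstract refers to the quadratic cost of full self-attention, which sparse-attention variants try to reduce. As a rough illustration only (a minimal NumPy sketch, not the paper's DBAFormer architecture; the weight matrices and dimensions here are made up), scaled dot-product self-attention makes the L x L origin of that cost visible:

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a length-L sequence.

    The L x L score matrix is what gives full self-attention its
    quadratic time and memory cost in the sequence length."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # shape (L, L): quadratic in L
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
L, d = 96, 16                                 # sequence length, model dim (illustrative)
x = rng.normal(size=(L, d))
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)  # (96, 16)
```

Sparse-attention methods replace the dense (L, L) score matrix with a restricted set of query-key pairs, trading exactness for sub-quadratic cost.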
Main Authors: Ji Huang, Minbo Ma, Yongsheng Dai, Jie Hu, Shengdong Du
Format: Article
Language: English
Published: Springer Nature, 2023-07-01
Series: Human-Centric Intelligent Systems
Online Access: https://doi.org/10.1007/s44230-023-00037-z
Similar Items
- Enformer: Encoder-Based Sparse Periodic Self-Attention Time-Series Forecasting
  by: Na Wang, et al. Published: (2023-01-01)
- Time Series Prediction Based on LSTM-Attention-LSTM Model
  by: Xianyun Wen, et al. Published: (2023-01-01)
- Forecasting Nonlinear Systems with LSTM: Analysis and Comparison with EKF
  by: Juan Pedro Llerena Caña, et al. Published: (2021-03-01)
- Encoder–decoder-based image transformation approach for integrating multiple spatial forecasts
  by: Hirotaka Hachiya, et al. Published: (2023-06-01)
- Triple-Stage Attention-Based Multiple Parallel Connection Hybrid Neural Network Model for Conditional Time Series Forecasting
  by: Yepeng Cheng, et al. Published: (2021-01-01)