Dense Sampling of Time Series for Forecasting
A time series contains a large amount of information suitable for forecasting. Classical statistical and recent deep learning models have been widely used in a variety of forecasting applications. During the training data preparation stage, most models collect samples by sliding a fixed-sized window...
Main Authors: | Il-Seok Oh, Jin-Seon Lee |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2022-01-01 |
Series: | IEEE Access |
Subjects: | Deep learning; forecasting problem; LSTM; time series; training data sampling |
Online Access: | https://ieeexplore.ieee.org/document/9831778/ |
_version_ | 1828524147672612864 |
---|---|
author | Il-Seok Oh; Jin-Seon Lee |
author_facet | Il-Seok Oh; Jin-Seon Lee |
author_sort | Il-Seok Oh |
collection | DOAJ |
description | A time series contains a large amount of information suitable for forecasting. Classical statistical and recent deep learning models have been widely used in a variety of forecasting applications. During the training data preparation stage, most models collect samples by sliding a fixed-sized window over the time axis of the input time series. We refer to this conventional method as “sparse sampling” because it cannot extract sufficient samples, as it ignores another important axis that represents the window size. In this study, a dense sampling method is proposed that extends the sampling space from one to two dimensions. The new space consists of the time and window axes. Dense sampling provides several desirable effects, such as a larger training dataset, an intra-model ensemble, model-agnosticism, and easier setting of the optimal window size. The experiments were conducted using four real datasets: Bitcoin price, influenza-like illness, household electric power consumption, and wind speed. The mean absolute percentage error was measured extensively in terms of varying window sizes, horizons, and lengths of time series. The results showed that dense sampling significantly and consistently outperformed sparse sampling. The source code and datasets are available at https://github.com/isoh24/Dense-sampling-time-series. |
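The description contrasts conventional fixed-window (“sparse”) sampling over the time axis with the proposed dense sampling over the two-dimensional (time, window size) space. The following minimal Python sketch illustrates that contrast; the function names, the window-size range, and the list-based return format are illustrative assumptions rather than the authors' implementation, which is available in the linked GitHub repository.

```python
import numpy as np

def sparse_samples(series, window, horizon=1):
    """Conventional sampling: slide one fixed-sized window over the time axis."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])                       # input window
        y.append(series[t + window:t + window + horizon])    # forecast target
    return X, y

def dense_samples(series, min_window, max_window, horizon=1):
    """Dense sampling: slide windows of every size in [min_window, max_window],
    extending the sampling space from the time axis alone to (time, window size)."""
    X, y = [], []
    for window in range(min_window, max_window + 1):          # window-size axis
        for t in range(len(series) - window - horizon + 1):   # time axis
            X.append(series[t:t + window])
            y.append(series[t + window:t + window + horizon])
    return X, y

# Toy usage: dense sampling yields far more training pairs than sparse sampling.
series = np.arange(100, dtype=float)
Xs, _ = sparse_samples(series, window=10)
Xd, _ = dense_samples(series, min_window=5, max_window=15)
print(len(Xs), len(Xd))  # 90 vs. 990 pairs
```

On this toy series of length 100, a single window of size 10 produces 90 training pairs, whereas sweeping window sizes 5 through 15 produces 990, which illustrates the larger training dataset the description mentions.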
first_indexed | 2024-12-11T20:40:59Z |
format | Article |
id | doaj.art-70ae4f8bb9aa407698d02aa9e5eef505 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-11T20:40:59Z |
publishDate | 2022-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-70ae4f8bb9aa407698d02aa9e5eef505; 2022-12-22T00:51:30Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2022-01-01; vol. 10, pp. 75571–75580; DOI 10.1109/ACCESS.2022.3191668; IEEE document 9831778; Dense Sampling of Time Series for Forecasting; Il-Seok Oh (https://orcid.org/0000-0002-8823-0438), Division of Computer Science and Engineering, Jeonbuk National University, Jeonju-si, South Korea; Jin-Seon Lee (https://orcid.org/0000-0003-0914-8258), Department of Information Security, Woosuk University, Wanju-gun, South Korea; https://ieeexplore.ieee.org/document/9831778/; Deep learning; forecasting problem; LSTM; time series; training data sampling |
spellingShingle | Il-Seok Oh Jin-Seon Lee Dense Sampling of Time Series for Forecasting IEEE Access Deep learning forecasting problem LSTM time series training data sampling |
title | Dense Sampling of Time Series for Forecasting |
title_full | Dense Sampling of Time Series for Forecasting |
title_fullStr | Dense Sampling of Time Series for Forecasting |
title_full_unstemmed | Dense Sampling of Time Series for Forecasting |
title_short | Dense Sampling of Time Series for Forecasting |
title_sort | dense sampling of time series for forecasting |
topic | Deep learning; forecasting problem; LSTM; time series; training data sampling |
url | https://ieeexplore.ieee.org/document/9831778/ |
work_keys_str_mv | AT ilseokoh densesamplingoftimeseriesforforecasting AT jinseonlee densesamplingoftimeseriesforforecasting |