Recurrent neural networks for time series prediction

Bibliographic Details
Main Author: Perez Orozco, B
Format: Thesis
Language: English
Published: 2019
Subjects: Machine Learning; Data science; Time series forecasting
description <p>Attempting to predict the future long predates the time when we could first quantify much of our present. Nowadays, long-term and reliable time series forecasting remains as relevant as ever across a vast number of application domains, including astronomy, healthcare, meteorology, physiology, energy systems, econometrics, finance and sociology, amongst many others. Practitioners act upon these forecasts, so it is essential that they be accompanied by well-calibrated uncertainty assessments to enable better-informed decision-making.</p> <p>This thesis proposes a novel time series forecasting framework: a scalable, general-purpose recurrent neural network approach that provides probabilistic predictions. Our method recasts time series forecasting as a symbolic sequence learning task by introducing a discrete encoding scheme over measurements. Such symbol sequence tasks have been most successfully addressed by Long Short-Term Memory (LSTM) models and extensions such as the sequence-to-sequence model, which form the core of our methodology. This presentation is accompanied by a large-scale experiment in which we demonstrate the effectiveness of our method on 45 different datasets against state-of-the-art baselines such as Gaussian Process Regression (GPR).</p> <p>Crucially, our framework also offers a number of extensions that address a broad range of use cases beyond univariate time series prediction. For example, we show how predictive densities over the occurrence of critical events within a predictive horizon can be inferred from our forecasts. While this requires access to labelled data, we provide a complementary method for recognising and labelling prototypical time series segments on the basis of the feature embedding learnt by our neural networks. Such motifs can subsequently be used to build predictive densities and characterise time series. Lastly, we extend our framework to allow for multivariate forecasting, enabling efficient inference of joint predictive densities over multiple correlated time series.</p> <p>Finally, time series data scarcity constrains the choice of forecasting model, especially as over-parameterised models such as recurrent neural networks cannot generalise from limited data. We therefore propose a novel approach to infer accurate and reliable predictive distributions from scarce time series data: a recurrent neural network that infers a joint feature representation over the space of quantised time series from a compilation of auxiliary datasets. Cross-task knowledge transfer is readily enabled by adopting an ordinal regression approach, as quantised time series can then be characterised in terms of their ordinal bin sequences.</p>
id oxford-uuid:fb471520-8541-4abb-95a0-7649809cac87
institution University of Oxford