Recurrent neural networks for time series prediction


Bibliographic Details
Main Author: Perez Orozco, B
Format: Thesis
Language: English
Published: 2019
Description
Summary:<p>Attempts to predict the future long predate our ability to quantify much of the present, yet reliable long-term time series forecasting remains as relevant as ever across a vast number of application domains, including astronomy, healthcare, meteorology, physiology, energy systems, econometrics, finance and sociology, amongst many others. Practitioners act upon these forecasts, and it is thus essential that they be accompanied by well-calibrated uncertainty estimates to enable better-informed decision-making.</p> <p>This thesis proposes a novel time series forecasting framework: a scalable, general-purpose recurrent neural network approach that provides probabilistic predictions. Our method recasts time series forecasting as a symbolic sequence learning task by introducing a discrete encoding scheme over measurements. Such symbolic sequence tasks have been most successfully addressed by Long Short-Term Memory (LSTM) models and extensions such as the sequence-to-sequence model, which form the core of our methodology. We demonstrate the effectiveness of our method in a large-scale experiment over 45 datasets, comparing against state-of-the-art baselines such as Gaussian Process Regression (GPR).</p> <p>Crucially, our framework also offers a number of extensions that address a broad range of use cases beyond univariate time series prediction. For example, we show how predictive densities over the occurrence of critical events within a forecast horizon can be inferred from our forecasts. While this requires access to labelled data, we provide a complementary method for recognising and labelling prototypical time series segments on the basis of the feature embedding learnt by our neural networks. Such motifs can subsequently be used to build predictive densities and characterise time series.
Lastly, we extend our framework to allow for multivariate forecasting, enabling efficient inference of joint predictive densities over multiple correlated time series.</p> <p>Finally, time series data scarcity constrains the choice of forecasting model, especially as over-parameterised models such as recurrent neural networks cannot generalise from limited data. In this thesis, we propose a novel approach to infer accurate and reliable predictive distributions from scarce time series data. We achieve this with a recurrent neural network that infers a joint feature representation over the space of quantised time series by means of a collection of auxiliary datasets. Cross-task knowledge transfer is readily enabled by adopting an ordinal regression approach, as quantised time series can then be characterised in terms of their ordinal bin sequences.</p>
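The discrete encoding at the heart of the summarised method can be illustrated with a minimal quantisation sketch. This assumes equal-width bins and a hypothetical `n_bins` parameter; the thesis's actual encoding scheme may differ (e.g. quantile-based bins):

```python
import numpy as np

def quantise_series(y, n_bins=8):
    """Map a continuous series to a symbol sequence of bin indices.

    Illustrative sketch only: equal-width bins are assumed here; the
    thesis's actual discrete encoding scheme may differ.
    """
    # n_bins - 1 interior edges yield bin indices in 0 .. n_bins - 1
    edges = np.linspace(y.min(), y.max(), n_bins + 1)[1:-1]
    return np.digitize(y, edges)

# Example: a noiseless sine wave becomes a repeating symbol sequence
y = np.sin(np.linspace(0, 4 * np.pi, 100))
symbols = quantise_series(y, n_bins=8)
```

A sequence-to-sequence LSTM with a categorical output layer can then be trained on such symbol sequences, so each forecast step yields a full predictive distribution over bins rather than a point estimate.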
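The ordinal treatment of quantised bins mentioned in the final paragraph can be sketched with the standard cumulative-threshold encoding used in ordinal regression (illustrative only; not necessarily the exact formulation in the thesis): the model predicts P(y > threshold_j) for each of the n_bins - 1 ordered thresholds, so quantised series from different datasets share one ordered output space.

```python
import numpy as np

def ordinal_targets(bins, n_bins):
    """Encode bin index k as k leading ones over n_bins - 1 thresholds.

    Standard cumulative encoding for ordinal regression, shown as an
    illustration of how quantised series admit ordinal treatment.
    """
    bins = np.asarray(bins)
    thresholds = np.arange(n_bins - 1)          # shape: (n_bins - 1,)
    return (bins[:, None] > thresholds).astype(float)

t = ordinal_targets([0, 2, 4], n_bins=5)
# bin 0 -> [0,0,0,0]; bin 2 -> [1,1,0,0]; bin 4 -> [1,1,1,1]
```

Because every auxiliary dataset is mapped into the same ordered bin space, a shared model can be trained across tasks, which is what enables the cross-task knowledge transfer described above.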