Performance Analysis of Long Short-Term Memory Predictive Neural Networks on Time Series Data

Long short-term memory (LSTM) neural networks have been proposed as a means of building accurate models from large time series datasets originating from various fields. These models can further be used in prediction, control, or anomaly-detection algorithms. However, finding the optimal hyperparameters that maximize a given performance criterion remains a challenge for both novice and experienced users, and hyperparameter optimization is often a resource-intensive and time-consuming task, particularly when the impact of each hyperparameter on network performance is not well understood. Teacher forcing denotes a procedure in which the ground-truth output from the previous time step is fed as input to the current time step during training, while during testing the model's own predictions are fed back instead. This paper presents a comprehensive examination of how hyperparameters affect the prediction performance of LSTM neural networks trained with and without teacher forcing. The study tests LSTM networks with two variations of teacher forcing, in two prediction modes, using two configurations (multi-input single-output and multi-input multi-output) on a well-known chemical process simulation dataset. Furthermore, the paper demonstrates the applicability of an LSTM network with a modified teacher forcing approach in a process state monitoring system. Over 100,000 experiments were conducted with varying hyperparameters and in multiple network operation modes, revealing the direct impact of each tested hyperparameter on the training and testing procedures.
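The teacher-forcing procedure described in the abstract can be summarized in a few lines of code. The sketch below is illustrative only and is not the authors' implementation: PyTorch, the layer sizes, and all names (LSTMPredictor, n_inputs, targets, and so on) are assumptions introduced here to show the single idea the abstract describes, namely feeding the ground-truth output back during training and the model's own prediction back during testing.

import torch
import torch.nn as nn

class LSTMPredictor(nn.Module):
    """One-step-ahead LSTM predictor with optional teacher forcing (illustrative sketch)."""

    def __init__(self, n_inputs, n_outputs, hidden_size=64):
        super().__init__()
        # The LSTM consumes the exogenous inputs concatenated with the
        # previous output (ground truth when teacher forced, otherwise
        # the model's own prediction).
        self.lstm = nn.LSTM(n_inputs + n_outputs, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_outputs)

    def forward(self, inputs, targets=None):
        # inputs:  (batch, time, n_inputs)   exogenous process variables
        # targets: (batch, time, n_outputs)  ground-truth outputs; if given,
        #          the network runs with teacher forcing (training mode).
        batch, steps, _ = inputs.shape
        y_prev = inputs.new_zeros(batch, 1, self.head.out_features)
        state, preds = None, []
        for t in range(steps):
            x_t = torch.cat([inputs[:, t:t + 1, :], y_prev], dim=-1)
            out, state = self.lstm(x_t, state)
            y_t = self.head(out)
            preds.append(y_t)
            if targets is not None:
                # Training: feed the ground truth from this step to the next.
                y_prev = targets[:, t:t + 1, :]
            else:
                # Testing: feed the prediction back instead.
                y_prev = y_t
        return torch.cat(preds, dim=1)

Under these assumptions, model(inputs, targets) reproduces the teacher-forced training behaviour, while model(inputs) reproduces the free-running testing behaviour in which predictions are fed back.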

Bibliographic Details
Main Authors: Roland Bolboacă, Piroska Haller
Author Affiliation: Faculty of Engineering and Information Technology, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Târgu Mureş, 540139 Târgu Mureş, Romania (both authors)
Format: Article
Language: English
Published: MDPI AG, 2023-03-01
Series: Mathematics (volume 11, issue 6, article 1432)
ISSN: 2227-7390
DOI: 10.3390/math11061432
Collection: Directory of Open Access Journals (DOAJ)
Subjects: long short-term memory (LSTM); recurrent neural network (RNN); teacher forcing; prediction; performance analysis; benchmarking
Online Access: https://www.mdpi.com/2227-7390/11/6/1432