Analyzing the Performance of Transformers for the Prediction of the Blood Glucose Level Considering Imputation and Smoothing

In this paper we investigate the effect of two preprocessing techniques, data imputation and smoothing, on the prediction of blood glucose levels in type 1 diabetes patients, using a novel deep learning model called the Transformer. We train three models, XGBoost, a one-dimensional convolutional neural network (1D-CNN), and the Transformer, to predict future blood glucose levels over a 30-min horizon from a 60-min time series history in the OhioT1DM dataset. We also compare four methods of handling missing time series data during model training (hourly mean, linear interpolation, cubic interpolation, and spline interpolation) and two smoothing techniques (Kalman smoothing and smoothing splines). Our experiments show that the Transformer performs better than XGBoost and the 1D-CNN when only continuous glucose monitoring (CGM) readings are used as predictors, and that it is very competitive with XGBoost when CGM and carbohydrate intake from meals are used to predict blood glucose levels. Overall, our results are more accurate than those reported in the literature.
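The record does not include code, but the preprocessing pipeline the abstract describes (impute missing CGM readings, then build fixed-length history windows with a 30-min-ahead target) can be sketched as below. This is a minimal illustration only: it assumes 5-minute CGM sampling (so a 60-min history is 12 readings and a 30-min horizon is 6 steps ahead), uses pandas/NumPy, and the names `impute_cgm` and `make_windows` are hypothetical, not taken from the authors' implementation.

```python
import numpy as np
import pandas as pd

def impute_cgm(series: pd.Series, method: str = "linear") -> pd.Series:
    """Fill gaps in a CGM series using one of the strategies compared in the paper."""
    if method == "hourly_mean":
        # Replace each missing reading with the mean CGM value of its hour of day.
        hourly = series.groupby(series.index.hour).transform("mean")
        return series.fillna(hourly)
    if method == "spline":
        # Smoothing-spline-style fill; order-3 spline through the observed points.
        return series.interpolate(method="spline", order=3)
    # "linear" or "cubic" interpolation between observed readings.
    return series.interpolate(method=method)

def make_windows(values: np.ndarray, history: int = 12, horizon: int = 6):
    """Build (60-min history, glucose 30 min ahead) pairs from 5-min samples."""
    X, y = [], []
    for t in range(history, len(values) - horizon):
        X.append(values[t - history:t])
        y.append(values[t + horizon - 1])
    return np.array(X), np.array(y)

# Usage on a synthetic CGM trace with a simulated sensor dropout.
idx = pd.date_range("2023-01-01", periods=288, freq="5min")
cgm = pd.Series(120 + 30 * np.sin(np.arange(288) / 20), index=idx)
cgm.iloc[50:56] = np.nan                    # missing readings
cgm = impute_cgm(cgm, method="linear")      # or "cubic", "spline", "hourly_mean"
X, y = make_windows(cgm.to_numpy())
print(X.shape, y.shape)                     # (270, 12) (270,)
```

The resulting `X`/`y` arrays are the kind of supervised pairs any of the three compared models (XGBoost, 1D-CNN, Transformer) could be trained on; the choice of imputation method is what the paper's experiments vary.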

Bibliographic Details
Main Authors: Edgar Acuna, Roxana Aparicio, Velcy Palomino
Format: Article
Language: English
Published: MDPI AG, 2023-02-01
Series: Big Data and Cognitive Computing
Subjects: diabetes; Transformer; 1D-CNN; XGBoosting; glucose prediction; imputation
Online Access: https://www.mdpi.com/2504-2289/7/1/41
Collection: DOAJ (Directory of Open Access Journals)
Record ID: doaj.art-9afeeb7e75af453cbfb7632773af4f7a
ISSN: 2504-2289
DOI: 10.3390/bdcc7010041
Citation: Big Data and Cognitive Computing, vol. 7, no. 1, art. 41 (2023)
Author Affiliations:
Edgar Acuna: Mathematical Sciences Department, University of Puerto Rico at Mayaguez, Mayaguez, PR 00681, Puerto Rico
Roxana Aparicio: Computer Science Department, University of Puerto Rico at Bayamon, Bayamon, PR 00959, Puerto Rico
Velcy Palomino: Computing and Information Sciences and Engineering, University of Puerto Rico at Mayaguez, Mayaguez, PR 00681, Puerto Rico