Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study
Background: During the COVID-19 pandemic, local health authorities were responsible for managing and reporting current cases in Germany. Since March 2020, employees had to contain the spread of COVID-19 by monitoring and contacting infected persons as well as tracing their contacts...
Main Authors: | Rieke Alpers, Lisa Kühne, Hong-Phuc Truong, Hajo Zeeb, Max Westphal, Sonja Jäckle |
---|---|
Format: | Article |
Language: | English |
Published: | JMIR Publications, 2023-06-01 |
Series: | JMIR Formative Research |
Online Access: | https://formative.jmir.org/2023/1/e44549 |
_version_ | 1797733937226711040 |
---|---|
author | Rieke Alpers, Lisa Kühne, Hong-Phuc Truong, Hajo Zeeb, Max Westphal, Sonja Jäckle |
author_sort | Rieke Alpers |
collection | DOAJ |
description |
Background: During the COVID-19 pandemic, local health authorities were responsible for managing and reporting current cases in Germany. Since March 2020, employees had to contain the spread of COVID-19 by monitoring and contacting infected persons as well as tracing their contacts. In the EsteR project, we implemented existing and newly developed statistical models as decision support tools to assist the work of the local health authorities.
Objective: The main goal of this study was to validate the EsteR toolkit in two complementary ways: first, by investigating the stability of the answers provided by our statistical tools with respect to the model parameters in the back end and, second, by having test users evaluate the usability and applicability of our web application in the front end.
Methods: To assess model stability, a sensitivity analysis was carried out for all 5 developed statistical models. The default parameters of our models, as well as the test ranges of the model parameters, were based on a previous literature review of COVID-19 properties. The answers obtained under different parameter settings were compared using dissimilarity metrics and visualized using contour plots. In addition, parameter ranges in which the models are generally stable were identified. For the usability evaluation of the web application, cognitive walk-throughs and focus group interviews were conducted with 6 containment scouts located at 2 different local health authorities. They were first asked to complete small tasks with the tools and then to express their general impressions of the web application.
Results: The simulation results showed that some statistical models were more sensitive to changes in their parameters than others. For each of the single-person use cases, we determined a parameter region in which the respective model can be rated as stable. In contrast, the results of the group use cases depended strongly on the user inputs, so no region of general model stability could be identified. We have also provided a detailed simulation report of the sensitivity analysis. In the user evaluation, the cognitive walk-throughs and focus group interviews revealed that the user interface needed to be simplified and that more guidance information was necessary. In general, the testers rated the web application as helpful, especially for new employees.
Conclusions: This evaluation study allowed us to refine the EsteR toolkit. Using the sensitivity analysis, we identified suitable model parameters and analyzed how stable the statistical models are with respect to changes in their parameters. Furthermore, the front end of the web application was improved in terms of user-friendliness based on the results of the cognitive walk-throughs and focus group interviews. |
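The sensitivity analysis described in the abstract (vary model parameters over literature-based test ranges, compare each output to the default via a dissimilarity metric, and mark a stability region) can be illustrated with a minimal sketch. This is not the authors' code: the lognormal incubation model, the default values, the grid, and the 0.05 stability threshold are all hypothetical placeholders chosen for illustration.

```python
# Illustrative sketch of a parameter sensitivity analysis (not the EsteR code):
# vary two parameters of a hypothetical incubation-time model over a grid,
# compare each resulting curve to the default via a dissimilarity metric,
# and collect the values that a contour plot would display.
import math

def incubation_cdf(day, mean, sd):
    """P(symptom onset <= day) under an assumed lognormal incubation model."""
    # Convert mean/sd on the day scale to lognormal mu/sigma^2.
    sigma2 = math.log(1 + (sd / mean) ** 2)
    mu = math.log(mean) - sigma2 / 2
    z = (math.log(day) - mu) / math.sqrt(2 * sigma2)
    return 0.5 * (1 + math.erf(z))

def dissimilarity(params_a, params_b, days=range(1, 15)):
    """Maximum absolute difference between the two CDFs over the evaluated days."""
    return max(abs(incubation_cdf(d, *params_a) - incubation_cdf(d, *params_b))
               for d in days)

default = (5.5, 2.3)  # placeholder default mean/sd in days
grid = [(m, s) for m in (4.5, 5.5, 6.5) for s in (1.8, 2.3, 2.8)]
table = {p: dissimilarity(p, default) for p in grid}  # data for a contour plot
stable = [p for p, d in table.items() if d < 0.05]    # example stability threshold
```

Plotting `table` over the (mean, sd) grid would yield the kind of contour plot the study uses; `stable` marks the parameter region in which the model's answers barely change.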
first_indexed | 2024-03-12T12:36:55Z |
format | Article |
id | doaj.art-06a98d36d81b48efb072628784245c9e |
institution | Directory Open Access Journal |
issn | 2561-326X |
language | English |
last_indexed | 2024-03-12T12:36:55Z |
publishDate | 2023-06-01 |
publisher | JMIR Publications |
record_format | Article |
series | JMIR Formative Research |
spelling | doaj.art-06a98d36d81b48efb072628784245c9e | 2023-08-29T00:07:16Z | eng | JMIR Publications | JMIR Formative Research | 2561-326X | 2023-06-01 | 7 | e44549 | DOI: 10.2196/44549 | Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study | Rieke Alpers (ORCID 0000-0001-8317-1435), Lisa Kühne (ORCID 0000-0002-7103-7658), Hong-Phuc Truong (ORCID 0000-0001-6495-053X), Hajo Zeeb (ORCID 0000-0001-7509-242X), Max Westphal (ORCID 0000-0002-8488-758X), Sonja Jäckle (ORCID 0000-0002-2908-299X) | https://formative.jmir.org/2023/1/e44549 |
title | Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study |
url | https://formative.jmir.org/2023/1/e44549 |