Second opinion needed: communicating uncertainty in medical machine learning


Bibliographic Details
Main Authors: Benjamin Kompa, Jasper Snoek, Andrew L. Beam
Format: Article
Language: English
Published: Nature Portfolio 2021-01-01
Series: npj Digital Medicine
Online Access: https://doi.org/10.1038/s41746-020-00367-3
_version_ 1797641578284580864
author Benjamin Kompa
Jasper Snoek
Andrew L. Beam
author_facet Benjamin Kompa
Jasper Snoek
Andrew L. Beam
author_sort Benjamin Kompa
collection DOAJ
description Abstract There is great excitement that medical artificial intelligence (AI) based on machine learning (ML) can be used to improve decision making at the patient level in a variety of healthcare settings. However, the quantification and communication of uncertainty for individual predictions is often neglected even though uncertainty estimates could lead to more principled decision-making and enable machine learning models to automatically or semi-automatically abstain on samples for which there is high uncertainty. In this article, we provide an overview of different approaches to uncertainty quantification and abstention for machine learning and highlight how these techniques could improve the safety and reliability of current ML systems being used in healthcare settings. Effective quantification and communication of uncertainty could help to engender trust with healthcare workers, while providing safeguards against known failure modes of current machine learning approaches. As machine learning becomes further integrated into healthcare environments, the ability to say “I’m not sure” or “I don’t know” when uncertain is a necessary capability to enable safe clinical deployment.
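The abstention behavior the abstract describes — a model that says "I don't know" on high-uncertainty inputs and defers to a clinician — can be illustrated with a minimal sketch. This is not code from the article: the function names, the entropy-based uncertainty measure, and the threshold value are all illustrative assumptions; real systems would use a calibrated uncertainty estimate and a clinically validated threshold.

```python
import math

def predictive_entropy(probs):
    """Shannon entropy of a predicted class distribution; higher means more uncertain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def predict_or_abstain(probs, threshold=0.5):
    """Return the index of the most probable class, or None ("I don't know")
    when predictive entropy exceeds the threshold, deferring to a human expert."""
    if predictive_entropy(probs) > threshold:
        return None  # abstain on this high-uncertainty sample
    return max(range(len(probs)), key=lambda i: probs[i])

# A confident prediction is returned; an ambiguous one triggers abstention.
print(predict_or_abstain([0.95, 0.03, 0.02]))  # 0
print(predict_or_abstain([0.40, 0.35, 0.25]))  # None
```

The same pattern generalizes to any uncertainty score (ensemble disagreement, Monte Carlo dropout variance, conformal prediction set size): the model only commits to a prediction when its uncertainty falls below a tolerance chosen for the clinical setting.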
first_indexed 2024-03-11T13:47:41Z
format Article
id doaj.art-d4f85d7dd17c413e9a36056423ac1ba9
institution Directory Open Access Journal
issn 2398-6352
language English
last_indexed 2024-03-11T13:47:41Z
publishDate 2021-01-01
publisher Nature Portfolio
record_format Article
series npj Digital Medicine
spelling doaj.art-d4f85d7dd17c413e9a36056423ac1ba9 2023-11-02T10:05:39Z eng Nature Portfolio npj Digital Medicine 2398-6352 2021-01-01 4 1 16 10.1038/s41746-020-00367-3 Second opinion needed: communicating uncertainty in medical machine learning — Benjamin Kompa (Department of Biomedical Informatics, Harvard Medical School); Jasper Snoek (Google Brain); Andrew L. Beam (Department of Biomedical Informatics, Harvard Medical School) — https://doi.org/10.1038/s41746-020-00367-3
spellingShingle Benjamin Kompa
Jasper Snoek
Andrew L. Beam
Second opinion needed: communicating uncertainty in medical machine learning
npj Digital Medicine
title Second opinion needed: communicating uncertainty in medical machine learning
title_full Second opinion needed: communicating uncertainty in medical machine learning
title_fullStr Second opinion needed: communicating uncertainty in medical machine learning
title_full_unstemmed Second opinion needed: communicating uncertainty in medical machine learning
title_short Second opinion needed: communicating uncertainty in medical machine learning
title_sort second opinion needed communicating uncertainty in medical machine learning
url https://doi.org/10.1038/s41746-020-00367-3
work_keys_str_mv AT benjaminkompa secondopinionneededcommunicatinguncertaintyinmedicalmachinelearning
AT jaspersnoek secondopinionneededcommunicatinguncertaintyinmedicalmachinelearning
AT andrewlbeam secondopinionneededcommunicatinguncertaintyinmedicalmachinelearning