Sources of bias in artificial intelligence that perpetuate healthcare disparities—A global review
Background: While artificial intelligence (AI) offers possibilities of advanced clinical prediction and decision-making in healthcare, models trained on relatively homogeneous datasets and on populations poorly representative of underlying diversity limit generalisability and ris...
Main Authors: Leo Anthony Celi, Jacqueline Cellini, Marie-Laure Charpignon, Edward Christopher Dee, Franck Dernoncourt, Rene Eber, William Greig Mitchell, Lama Moukheiber, Julian Schirmer, Julia Situ, Joseph Paguio, Joel Park, Judy Gichoya Wawira, Seth Yao
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2022-03-01
Series: PLOS Digital Health
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9931338/?tool=EBI
Similar Items
- Sources of bias in artificial intelligence that perpetuate healthcare disparities—A global review
  by: Celi, Leo Anthony, et al.
  Published: (2022)
- Sources of bias in artificial intelligence that perpetuate healthcare disparities-A global review.
  by: Leo Anthony Celi, et al.
  Published: (2022-03-01)
- Village mentoring and hive learning: The MIT Critical Data experience
  by: Christopher V. Cosgriff, et al.
  Published: (2021-06-01)
- Equity should be fundamental to the emergence of innovation
  by: Jack Gallifant, et al.
  Published: (2023-04-01)
- Equity in essence: a call for operationalising fairness in machine learning for healthcare
  by: Wawira Gichoya, Judy, et al.
  Published: (2021)