Gender Bias When Using Artificial Intelligence to Assess Anorexia Nervosa on Social Media: Data-Driven Study


Bibliographic Details
Main Authors: David Solans Noguero, Diana Ramírez-Cifuentes, Esteban Andrés Ríssola, Ana Freire
Format: Article
Language: English
Published: JMIR Publications, 2023-06-01
Series: Journal of Medical Internet Research
ISSN: 1438-8871
DOI: 10.2196/45184
Online Access: https://www.jmir.org/2023/1/e45184
Abstract

Background: Social media sites are becoming an increasingly important source of information about mental health disorders. Among them, eating disorders are complex psychological problems that involve unhealthy eating habits. In particular, there is evidence that signs and symptoms of anorexia nervosa can be traced on social media platforms. Because input data biases tend to be amplified by artificial intelligence algorithms, and by machine learning in particular, these methods should be reviewed to mitigate biased discrimination in such sensitive domains.

Objective: The main goal of this study was to detect and analyze performance disparities across genders in algorithms trained to detect anorexia nervosa in social media posts. We used a collection of automated predictors trained on a Spanish-language data set containing 177 users who showed signs of anorexia (471,262 tweets) and 326 control cases (910,967 tweets).

Methods: We first inspected the differences in predictive performance between the algorithms for male and female users. Once biases were detected, we applied a feature-level bias characterization to evaluate their source and compared these features with those that clinicians consider relevant. Finally, we showcased different bias mitigation strategies for developing fairer automated classifiers, particularly for risk assessment in sensitive domains.

Results: Our results revealed concerning differences in predictive performance, with a substantially higher false negative rate (FNR) for female samples (FNR=0.082) than for male samples (FNR=0.005). The findings show that biological processes and suicide risk factors were relevant for classifying positive male cases, whereas age, emotions, and personal concerns were more relevant for female cases. We also proposed techniques for bias mitigation and observed that, even though disparities can be reduced, they cannot be eliminated.

Conclusions: More attention should be paid to the assessment of biases in automated methods dedicated to the detection of mental health issues. This is particularly relevant before the deployment of systems intended to assist clinicians, especially considering that the outputs of such systems can affect the diagnosis of people at risk.
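The disparity reported in the Results is measured with the per-group false negative rate, the share of truly positive cases the classifier misses within each gender group. The following is a minimal sketch (not the study's code; all labels, predictions, and group values below are toy placeholders) of how such group-wise FNRs can be computed:

```python
import numpy as np

def false_negative_rate(y_true, y_pred):
    """FNR = FN / (FN + TP): fraction of true positive cases the model misses."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    positives = y_true == 1
    if positives.sum() == 0:
        return float("nan")  # undefined when a group has no positive cases
    false_negatives = np.sum(positives & (y_pred == 0))
    return false_negatives / positives.sum()

def fnr_by_group(y_true, y_pred, groups):
    """Compute the FNR separately for each value of a protected attribute."""
    y_true, y_pred, groups = map(np.asarray, (y_true, y_pred, groups))
    return {g: false_negative_rate(y_true[groups == g], y_pred[groups == g])
            for g in np.unique(groups)}

# Toy usage: 1 = showed signs of anorexia, 0 = control.
y_true = [1, 1, 0, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 0, 1]
groups = ["female", "female", "female", "male", "male", "male", "female", "male"]
print(fnr_by_group(y_true, y_pred, groups))
```

Equalizing this rate across groups corresponds to the "equal opportunity" fairness criterion; a gap such as the study's 0.082 versus 0.005 means positive female cases are missed far more often than positive male cases.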
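This record does not name the specific mitigation strategies the authors showcased. As one commonly used, clearly substituted example (not the paper's method), Kamiran-Calders reweighing assigns each training sample a weight that makes the protected attribute look statistically independent of the label; the data, features, and classifier choice below are purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def reweighing_weights(groups, labels):
    """Kamiran-Calders reweighing: weight each (group, label) cell by
    P(group) * P(label) / P(group, label), so that group membership and
    class label appear independent to the downstream learner."""
    groups, labels = np.asarray(groups), np.asarray(labels)
    n = len(labels)
    weights = np.ones(n, dtype=float)
    for g in np.unique(groups):
        for y in np.unique(labels):
            mask = (groups == g) & (labels == y)
            if mask.sum() == 0:
                continue
            expected = np.mean(groups == g) * np.mean(labels == y)
            observed = mask.sum() / n
            weights[mask] = expected / observed
    return weights

# Illustrative training call; X would be text features (e.g., TF-IDF of tweets).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
y = np.array([1, 1, 0, 1, 0, 1, 0, 0])
groups = np.array(["f", "f", "f", "m", "m", "m", "f", "m"])
clf = LogisticRegression().fit(X, y, sample_weight=reweighing_weights(groups, y))
```

Preprocessing approaches like this are only one family of options; consistent with the study's conclusion, such interventions can narrow performance gaps between groups but do not guarantee their elimination.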