FairCaipi: A Combination of Explanatory Interactive and Fair Machine Learning for Human and Machine Bias Reduction

Bibliographic Details
Main Authors: Louisa Heidrich, Emanuel Slany, Stephan Scheele, Ute Schmid
Format: Article
Language: English
Published: MDPI AG 2023-10-01
Series: Machine Learning and Knowledge Extraction
Online Access: https://www.mdpi.com/2504-4990/5/4/76
Description
Summary: The rise of machine-learning applications in domains with critical end-user impact has led to a growing concern about the fairness of learned models, with the goal of avoiding biases that negatively impact specific demographic groups. Most existing bias-mitigation strategies adapt the importance of data instances during pre-processing. Since fairness is a contextual concept, we advocate for an interactive machine-learning approach that enables users to provide iterative feedback for model adaptation. Specifically, we propose to adapt the explanatory interactive machine-learning approach Caipi for fair machine learning. FairCaipi incorporates human feedback in the loop on predictions and explanations to improve the fairness of the model. Experimental results demonstrate that FairCaipi outperforms a state-of-the-art pre-processing bias-mitigation strategy in terms of both the fairness and the predictive performance of the resulting machine-learning model. We show that FairCaipi can uncover and reduce bias in machine-learning models, and that it also allows us to detect human bias.
ISSN: 2504-4990
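
The summary describes an interactive loop in which a human reviews the model's predictions and explanations and corrects unfair feature use. The sketch below illustrates that general Caipi-style idea under assumptions of our own, not the paper's implementation: logistic-regression coefficients stand in for explanations, a simulated user flags a protected feature that carries noticeable weight, and the correction is encoded as counterexamples in which the flagged feature is shuffled. The function names (`explain`, `human_feedback`, `counterexamples`) and the synthetic data are hypothetical.

```python
# Minimal, illustrative sketch of an explanatory interactive fairness loop.
# Not the authors' implementation; all names and data here are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic data: column 2 is a "sensitive" attribute that correlates with the label.
n = 1000
X = rng.normal(size=(n, 3))
sensitive = (rng.random(n) < 0.5).astype(float)
X[:, 2] = sensitive
y = ((X[:, 0] + 0.5 * X[:, 1] + 1.5 * sensitive) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

def explain(model):
    """Use global coefficients as a simple stand-in for an explanation."""
    return dict(enumerate(model.coef_[0]))

def human_feedback(explanation, protected=(2,)):
    """Simulated user: flag protected features that carry noticeable weight."""
    return [i for i in protected if abs(explanation[i]) > 0.1]

def counterexamples(X, y, flagged, k=200, rng=rng):
    """Correction step: duplicate instances with the flagged features shuffled,
    signalling that these features should not change the prediction."""
    idx = rng.choice(len(X), size=k, replace=False)
    Xc = X[idx].copy()
    for i in flagged:
        Xc[:, i] = rng.permutation(Xc[:, i])
    return Xc, y[idx]

# One interaction round: explain, collect feedback, augment the data, retrain.
flagged = human_feedback(explain(model))
if flagged:
    Xc, yc = counterexamples(X, y, flagged)
    model = LogisticRegression().fit(np.vstack([X, Xc]), np.concatenate([y, yc]))

print("weight on sensitive feature after feedback:", model.coef_[0][2])
```

In an actual interactive setting, the simulated feedback step would be replaced by a real user inspecting local explanations of individual predictions, and the counterexample augmentation is one common way to feed "this feature should not matter" back into training.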