Measuring Imbalance on Intersectional Protected Attributes and on Target Variable to Forecast Unfair Classifications
Bias in software systems is a serious threat to human rights: when software makes decisions that allocate resources or opportunities, it may disparately impact people based on personal traits (e.g., gender or ethnic group), systematically (dis)advantaging certain social groups. The cause is very of...
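As a rough illustration of the kind of measurement the title describes (not necessarily the authors' exact metric), the Python sketch below computes a normalized Gini heterogeneity index over the joint distribution of two protected attributes and over the target variable. The function name, column names, sample data, and the choice of the Gini index are assumptions for illustration only.

```python
import pandas as pd

def gini_balance(series: pd.Series) -> float:
    """Normalized Gini heterogeneity index of a categorical distribution.

    Returns 1.0 for a perfectly balanced distribution and values near 0.0
    when a single class dominates.
    """
    p = series.value_counts(normalize=True)  # class proportions
    m = len(p)
    if m < 2:
        return 0.0  # a single class is maximally imbalanced by convention
    gini = 1.0 - (p ** 2).sum()
    return float(gini * m / (m - 1))  # rescale so the uniform case yields 1.0

# Hypothetical dataset with two protected attributes and a binary target.
df = pd.DataFrame({
    "gender":    ["F", "F", "M", "M", "M", "M", "F", "M"],
    "ethnicity": ["A", "B", "A", "A", "A", "B", "A", "A"],
    "hired":     [0, 1, 1, 1, 0, 1, 0, 1],
})

# Imbalance on the intersection of protected attributes (gender x ethnicity).
intersection = df["gender"] + "/" + df["ethnicity"]
print(f"balance(gender x ethnicity) = {gini_balance(intersection):.2f}")

# Imbalance on the target variable.
print(f"balance(target)             = {gini_balance(df['hired']):.2f}")
```

In this framing, a balance score close to 0 on an intersectional subgroup distribution would flag the dataset as at risk of producing unfair classifications downstream, which is the forecasting premise suggested by the title.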
Main Authors: | Mariachiara Mecati, Marco Torchiano, Antonio Vetrò, Juan Carlos de Martin
---|---
Format: | Article
Language: | English
Published: | IEEE, 2023-01-01
Series: | IEEE Access
Online Access: | https://ieeexplore.ieee.org/document/10058507/
Similar Items
- Solving Data Imbalance in Text Classification With Constructing Contrastive Samples
  by: Xi Chen, et al.
  Published: (2023-01-01)
- Controlling Bias Between Categorical Attributes in Datasets: A Two-Step Optimization Algorithm Leveraging Structural Equation Modeling
  by: Enrico Barbierato, et al.
  Published: (2023-01-01)
- A step toward building a unified framework for managing AI bias
  by: Saadia Afzal Rana, et al.
  Published: (2023-10-01)
- FAWOS: Fairness-Aware Oversampling Algorithm Based on Distributions of Sensitive Attributes
  by: Teresa Salazar, et al.
  Published: (2021-01-01)
- Exoskeletons for all: The interplay between exoskeletons, inclusion, gender, and intersectionality
  by: Søraa Roger Andre, et al.
  Published: (2020-05-01)