Why is my classifier discriminatory?

© 2018 Curran Associates Inc. All rights reserved. Recent attempts to achieve fairness in predictive models focus on the balance between fairness and accuracy. In sensitive applications such as healthcare or criminal justice, this trade-off is often undesirable, as any increase in prediction error could have devastating consequences. In this work, we argue that the fairness of predictions should be evaluated in the context of the data, and that unfairness induced by inadequate sample sizes or unmeasured predictive variables should be addressed through data collection, rather than by constraining the model. We decompose cost-based metrics of discrimination into bias, variance, and noise, and propose actions aimed at estimating and reducing each term. Finally, we perform case studies on prediction of income, mortality, and review ratings, confirming the value of this analysis. We find that data collection is often a means to reduce discrimination without sacrificing accuracy.
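As a minimal sketch of the decomposition the abstract describes, assuming a cost function \ell and a binary protected attribute A (the symbols \gamma_a, \bar{N}_a, \bar{B}_a, \bar{V}_a are illustrative notation in the style of Domingos-type bias-variance-noise splits, not necessarily the paper's exact definitions): writing \gamma_a for the expected cost of predictor \hat{Y} on the group with A = a,

    \[
      \gamma_a \;=\; \mathbb{E}\!\left[\ell(\hat{Y}, Y)\mid A = a\right]
      \;=\; \bar{N}_a + \bar{B}_a + \bar{V}_a,
      \qquad
      \Gamma \;=\; \left|\gamma_0 - \gamma_1\right|
    \]

Under this split, the discrimination level \Gamma can be attributed to group-wise differences in variance (addressable by collecting more samples for the disadvantaged group), noise (addressable by measuring additional predictive variables), and bias, which matches the abstract's claim that data collection, rather than constraining the model, can reduce discrimination without sacrificing accuracy.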

Bibliographic Details
Main Authors: Sontag, David; Johansson, Fredrik D.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article (Conference Paper)
Language: English
Published: Neural Information Processing Systems (NIPS), 2018; made available in the MIT repository 2021
Citation: Sontag, David and Johansson, Fredrik D. 2018. "Why is my classifier discriminatory?" Advances in Neural Information Processing Systems, 2018-December.
Online Access: https://hdl.handle.net/1721.1/137319
Publisher Version: https://papers.nips.cc/paper/2018/hash/1f1baa5b8edac74eb4eaa329f14a0361-Abstract.html
Terms of Use: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.