A general framework for fair regression
Fairness, through its many forms and definitions, has become an important issue facing the machine learning community. In this work, we consider how to incorporate group fairness constraints into kernel regression methods, applicable to Gaussian processes, support vector machines, neural network regression and decision tree regression. Further, we focus on examining the effect of incorporating these constraints in decision tree regression, with direct applications to random forests and boosted trees amongst other widespread popular inference techniques. We show that the order of complexity of memory and computation is preserved for such models and tightly binds the expected perturbations to the model in terms of the number of leaves of the trees. Importantly, the approach works on trained models and hence can be easily applied to models in current use, and group labels are only required on training data.
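The abstract describes post-hoc group-fairness constraints applied to an already-trained regressor. As a minimal illustrative sketch (not the paper's actual method — the function names and the simple mean-matching constraint are assumptions), one can shift a trained model's predictions per group so that every group's mean prediction matches the overall mean, using group labels only where they are available:

```python
# Hypothetical sketch of post-hoc group-fairness adjustment for regression:
# shift each group's predictions so its mean matches the overall mean.
# This is an illustration of the general idea, not the paper's algorithm.

def group_mean_offsets(preds, groups):
    """Per-group additive shift aligning each group's mean prediction
    with the overall mean prediction."""
    overall = sum(preds) / len(preds)
    by_group = {}
    for p, g in zip(preds, groups):
        by_group.setdefault(g, []).append(p)
    return {g: overall - sum(ps) / len(ps) for g, ps in by_group.items()}

def apply_fair_adjustment(preds, groups, offsets):
    """Apply the per-group offsets to a set of predictions."""
    return [p + offsets[g] for p, g in zip(preds, groups)]

# Toy usage: group "a" is systematically over-predicted relative to "b".
preds = [3.0, 4.0, 1.0, 2.0]
groups = ["a", "a", "b", "b"]
offsets = group_mean_offsets(preds, groups)   # {"a": -1.0, "b": 1.0}
fair = apply_fair_adjustment(preds, groups, offsets)
```

For a tree model, the same kind of additive correction could in principle be folded into the leaf values, which is consistent with the abstract's claim that memory and computational complexity are preserved and that perturbations scale with the number of leaves.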
Main Authors: | Fitzsimons, J; Ali, AA; Osborne, M; Roberts, S |
---|---|
Format: | Journal article |
Language: | English |
Published: | MDPI, 2019 |
_version_ | 1797083567494266880 |
---|---|
author | Fitzsimons, J; Ali, AA; Osborne, M; Roberts, S |
author_facet | Fitzsimons, J; Ali, AA; Osborne, M; Roberts, S |
author_sort | Fitzsimons, J |
collection | OXFORD |
description | Fairness, through its many forms and definitions, has become an important issue facing the machine learning community. In this work, we consider how to incorporate group fairness constraints into kernel regression methods, applicable to Gaussian processes, support vector machines, neural network regression and decision tree regression. Further, we focus on examining the effect of incorporating these constraints in decision tree regression, with direct applications to random forests and boosted trees amongst other widespread popular inference techniques. We show that the order of complexity of memory and computation is preserved for such models and tightly binds the expected perturbations to the model in terms of the number of leaves of the trees. Importantly, the approach works on trained models and hence can be easily applied to models in current use and group labels are only required on training data. |
first_indexed | 2024-03-07T01:43:18Z |
format | Journal article |
id | oxford-uuid:97901596-7500-4bfd-a40b-bfad27d6aad2 |
institution | University of Oxford |
language | English |
last_indexed | 2024-03-07T01:43:18Z |
publishDate | 2019 |
publisher | MDPI |
record_format | dspace |
spelling | oxford-uuid:97901596-7500-4bfd-a40b-bfad27d6aad2; 2022-03-27T00:00:44Z; A general framework for fair regression; Journal article; http://purl.org/coar/resource_type/c_dcae04bc; uuid:97901596-7500-4bfd-a40b-bfad27d6aad2; English; Symplectic Elements; MDPI; 2019; Fitzsimons, J; Ali, AA; Osborne, M; Roberts, S; Fairness, through its many forms and definitions, has become an important issue facing the machine learning community. In this work, we consider how to incorporate group fairness constraints into kernel regression methods, applicable to Gaussian processes, support vector machines, neural network regression and decision tree regression. Further, we focus on examining the effect of incorporating these constraints in decision tree regression, with direct applications to random forests and boosted trees amongst other widespread popular inference techniques. We show that the order of complexity of memory and computation is preserved for such models and tightly binds the expected perturbations to the model in terms of the number of leaves of the trees. Importantly, the approach works on trained models and hence can be easily applied to models in current use and group labels are only required on training data. |
spellingShingle | Fitzsimons, J; Ali, AA; Osborne, M; Roberts, S; A general framework for fair regression |
title | A general framework for fair regression |
title_sort | general framework for fair regression |