Racial, skin tone, and sex disparities in automated proctoring software
Students of color, particularly women of color, face substantial barriers in STEM disciplines in higher education due to social isolation and interpersonal, technological, and institutional biases. For example, online exam proctoring software often uses facial detection technology to identify potential cheating behaviors...
Main Authors: | Deborah R. Yoder-Himes, Alina Asif, Kaelin Kinney, Tiffany J. Brandt, Rhiannon E. Cecil, Paul R. Himes, Cara Cashon, Rachel M. P. Hopp, Edna Ross |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2022-09-01 |
Series: | Frontiers in Education |
Subjects: | algorithmic bias, test proctoring, skin tone, facial detection, software |
Online Access: | https://www.frontiersin.org/articles/10.3389/feduc.2022.881449/full |
_version_ | 1798035745615642624 |
---|---|
author | Deborah R. Yoder-Himes Alina Asif Kaelin Kinney Tiffany J. Brandt Rhiannon E. Cecil Paul R. Himes Cara Cashon Rachel M. P. Hopp Edna Ross |
author_facet | Deborah R. Yoder-Himes Alina Asif Kaelin Kinney Tiffany J. Brandt Rhiannon E. Cecil Paul R. Himes Cara Cashon Rachel M. P. Hopp Edna Ross |
author_sort | Deborah R. Yoder-Himes |
collection | DOAJ |
description | Students of color, particularly women of color, face substantial barriers in STEM disciplines in higher education due to social isolation and interpersonal, technological, and institutional biases. For example, online exam proctoring software often uses facial detection technology to identify potential cheating behaviors. Undetected faces are often flagged as “suspicious” instances needing manual review, and instructors are notified of these flags. However, the facial detection algorithms employed by exam proctoring software may be biased against students with certain skin tones or genders, depending on the images each company uses as training sets. This phenomenon has not yet been quantified, nor is such information readily available from the companies that make this type of software. To determine whether the automated proctoring software adopted at our institution, which is used by at least 1,500 universities nationally, suffered from racial, skin tone, or gender bias, the instructor outputs for ∼357 students across four courses were examined. Data from one exam in each course were collected, a high-resolution photograph was used to manually categorize each student's skin tone, and the self-reported race and sex of each student were obtained. The likelihood that any group of students was flagged more frequently for potential cheating was then examined. The results of this study showed a significantly increased likelihood that students with darker skin tones, and Black students, would be marked as in need of instructor review for potential cheating. Interestingly, there were no significant differences between male and female students when considered in aggregate but, when examined for intersectional differences, women with the darkest skin tones were far more likely than darker-skinned males or lighter-skinned males and females to be flagged for review. 
Together, these results suggest that a major automated proctoring software package may employ biased AI algorithms that unfairly disadvantage students. This study is novel in that it is the first to quantitatively examine bias in facial detection software at the intersection of race and sex, and it has potential impacts in many areas of education, social justice, educational equity and diversity, and psychology. |
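The group-level flag-rate comparison described in the abstract can be sketched as a two-proportion z-test. This is an illustrative assumption about the kind of analysis involved, not the paper's actual method, and the counts below are invented for demonstration only; they are not the study's data.

```python
import math

def two_proportion_z(flagged_a, n_a, flagged_b, n_b):
    """z statistic for H0: the two groups have equal flagging rates."""
    p_a = flagged_a / n_a
    p_b = flagged_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (flagged_a + flagged_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example counts: 30 of 80 students flagged in one skin-tone
# group vs. 40 of 277 in another (hypothetical numbers).
z = two_proportion_z(30, 80, 40, 277)
print(round(z, 2))
```

A large positive z here would indicate the first group is flagged at a significantly higher rate; a real analysis of this kind would also need a multiple-group test (e.g., a chi-square test of independence) when comparing more than two skin-tone categories.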
first_indexed | 2024-04-11T21:02:29Z |
format | Article |
id | doaj.art-f8665b57a8df401fb560691a2e1499cc |
institution | Directory Open Access Journal |
issn | 2504-284X |
language | English |
last_indexed | 2024-04-11T21:02:29Z |
publishDate | 2022-09-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Education |
spelling | doaj.art-f8665b57a8df401fb560691a2e1499cc2022-12-22T04:03:27ZengFrontiers Media S.A.Frontiers in Education2504-284X2022-09-01710.3389/feduc.2022.881449881449Racial, skin tone, and sex disparities in automated proctoring softwareDeborah R. Yoder-Himes0Alina Asif1Kaelin Kinney2Tiffany J. Brandt3Rhiannon E. Cecil4Paul R. Himes5Cara Cashon6Rachel M. P. Hopp7Edna Ross8Department of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Psychology and Brain Sciences, University of Louisville, Louisville, KY, United StatesDepartment of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Psychology and Brain Sciences, University of Louisville, Louisville, KY, United StatesDepartment of Biology, University of Louisville, Louisville, KY, United StatesDepartment of Psychology and Brain Sciences, University of Louisville, Louisville, KY, United StatesStudents of color, particularly women of color, face substantial barriers in STEM disciplines in higher education due to social isolation and interpersonal, technological, and institutional biases. For example, online exam proctoring software often uses facial detection technology to identify potential cheating behaviors. Undetected faces often result in flagging and notifying instructors of these as “suspicious” instances needing manual review. However, facial detection algorithms employed by exam proctoring software may be biased against students with certain skin tones or genders depending on the images employed by each company as training sets. This phenomenon has not yet been quantified nor is it readily accessible from the companies that make this type of software. 
To determine if the automated proctoring software adopted at our institution and which is used by at least 1,500 universities nationally, suffered from a racial, skin tone, or gender bias, the instructor outputs from ∼357 students from four courses were examined. Student data from one exam in each course was collected, a high-resolution photograph was used to manually categorize skin tone, and the self-reported race and sex for each student was obtained. The likelihood that any groups of students were flagged more frequently for potential cheating was examined. The results of this study showed a significant increase in likelihood that students with darker skin tones and Black students would be marked as more in need of instructor review due to potential cheating. Interestingly, there were no significant differences between male and female students when considered in aggregate but, when examined for intersectional differences, women with the darkest skin tones were far more likely than darker skin males or lighter skin males and females to be flagged for review. Together, these results suggest that a major automated proctoring software may employ biased AI algorithms that unfairly disadvantage students. This study is novel as it is the first to quantitatively examine biases in facial detection software at the intersection of race and sex and it has potential impacts in many areas of education, social justice, education equity and diversity, and psychology.https://www.frontiersin.org/articles/10.3389/feduc.2022.881449/fullalgorithmic biastest proctoringskin tonefacial detectionsoftware |
spellingShingle | Deborah R. Yoder-Himes Alina Asif Kaelin Kinney Tiffany J. Brandt Rhiannon E. Cecil Paul R. Himes Cara Cashon Rachel M. P. Hopp Edna Ross Racial, skin tone, and sex disparities in automated proctoring software Frontiers in Education algorithmic bias test proctoring skin tone facial detection software |
title | Racial, skin tone, and sex disparities in automated proctoring software |
title_full | Racial, skin tone, and sex disparities in automated proctoring software |
title_fullStr | Racial, skin tone, and sex disparities in automated proctoring software |
title_full_unstemmed | Racial, skin tone, and sex disparities in automated proctoring software |
title_short | Racial, skin tone, and sex disparities in automated proctoring software |
title_sort | racial skin tone and sex disparities in automated proctoring software |
topic | algorithmic bias test proctoring skin tone facial detection software |
url | https://www.frontiersin.org/articles/10.3389/feduc.2022.881449/full |
work_keys_str_mv | AT deborahryoderhimes racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT alinaasif racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT kaelinkinney racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT tiffanyjbrandt racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT rhiannonececil racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT paulrhimes racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT caracashon racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT rachelmphopp racialskintoneandsexdisparitiesinautomatedproctoringsoftware AT ednaross racialskintoneandsexdisparitiesinautomatedproctoringsoftware |