Predictive privacy: Collective data protection in the context of artificial intelligence and big data


Bibliographic Details
Main Author: Rainer Mühlhoff
Format: Article
Language: English
Published: SAGE Publishing 2023-01-01
Series: Big Data & Society
Online Access: https://doi.org/10.1177/20539517231166886
Description: Big data and artificial intelligence pose a new challenge for data protection as these techniques allow predictions to be made about third parties based on the anonymous data of many people. Examples of predicted information include purchasing power, gender, age, health, sexual orientation, ethnicity, etc. The basis for such applications of “predictive analytics” is the comparison between behavioral data (e.g. usage, tracking, or activity data) of the individual in question and the potentially anonymously processed data of many others using machine learning models or simpler statistical methods. The article starts by noting that predictive analytics has a significant potential to be abused, which manifests itself in the form of social inequality, discrimination, and exclusion. These potentials are not regulated by current data protection law in the EU; indeed, the use of anonymized mass data takes place in a largely unregulated space. Under the term “predictive privacy,” a data protection approach is presented that counters the risks of abuse of predictive analytics. A person's predictive privacy is violated when personal information about them is predicted without their knowledge and against their will based on the data of many other people. Predictive privacy is then formulated as a protected good and improvements to data protection with regard to the regulation of predictive analytics are proposed. Finally, the article points out that the goal of data protection in the context of predictive analytics is the regulation of “prediction power,” which is a new manifestation of informational power asymmetry between platform companies and society.
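The mechanism the abstract describes — a model trained on the anonymously processed data of many people, then applied to a third party's behavioral data — can be sketched in a few lines. This is an illustrative example with synthetic data and an invented "sensitive attribute"; it is not code from the article, only a minimal demonstration of the inference pattern it critiques.

```python
# Illustrative sketch (synthetic data): a model trained on the anonymous
# behavioral data of many "data donors" is used to infer a sensitive
# attribute of a third party who never disclosed that attribute.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Behavioral data of 1,000 donors (e.g. usage or activity features),
# paired with a known sensitive attribute. No identifying information
# about the donors is needed to train the model.
donors_X = rng.normal(size=(1000, 5))
donors_y = (donors_X[:, 0] + donors_X[:, 1] > 0).astype(int)

model = LogisticRegression().fit(donors_X, donors_y)

# The target person only ever shared behavioral data, yet the model
# predicts the sensitive attribute -- the "predictive privacy" violation.
target_X = rng.normal(size=(1, 5))
predicted_attribute = int(model.predict(target_X)[0])
print(predicted_attribute)  # inferred sensitive attribute (0 or 1)
```

Note that nothing in this pipeline touches the target's consent: the inference is driven entirely by statistical regularities in other people's data, which is why the article argues that purely individual data protection cannot capture the harm.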
ISSN: 2053-9517