Down by algorithms? Siphoning rents, exploiting biases, and shaping preferences: Regulating the dark side of personalized transactions


Bibliographic Details
Main Authors: Wagner, G, Eidenmueller, H
Format: Journal article
Language:English
Published: University of Chicago Press 2019
description In this Essay, we seek to systematically explore and understand crucial aspects of the dark side of personalized business-to-consumer (B2C) transactions. We identify three areas of concern. First, businesses increasingly engage in first-degree price discrimination, siphoning rents from consumers. Second, firms exploit widespread or idiosyncratic behavioral biases of consumers in a systematic fashion. And third, businesses use microtargeted ads and recommendations to shape consumers’ preferences and steer them into a particular consumption pattern. Siphoning rents, exploiting biases, and shaping preferences appear to be relatively distinct phenomena. However, these phenomena share a common underlying theme: the exploitation of consumers, or at least an impoverishment of their lives, by firms that apply novel and sophisticated technological means to maximize profits. Hence, the dark side of personalized B2C transactions may be characterized as consumers being “brought down by algorithms”: losing transaction surplus, engaging in welfare-reducing transactions, and increasingly being trapped in a narrower life. It is unclear whether first-degree price discrimination creates an efficiency problem, but it surely raises concerns of distributive justice. We propose that it should be addressed by a clear and simple warning to the consumer that she is being offered a personalized price and, in addition, a right to indicate that she does not want to participate in a personalized pricing scheme. Similarly, behavioral biases may or may not lead consumers to conclude inefficient transactions. But they should be given an opportunity to reflect on their choices if these have been induced by firms applying exploitative algorithmic sales techniques. Hence, we propose that consumers should have a right to withdraw from a contract concluded under such conditions. Finally, shaping consumers’ preferences by microtargeted ads and recommendations prevents consumers from experimenting. They should have a right to opt out of the technological steering mechanisms created and utilized by firms that impoverish their lives. Regulation along the lines proposed in this Essay is necessary because competitive markets will not protect unknowledgeable or otherwise weak consumers from exploitation. A general “right to anonymity” for consumers in the digital world could be the macrosolution to the microproblems discussed. If it were recognized and protected by the law, it might be possible to reap (most of) the benefits of personalization while avoiding (most of) its pitfalls.
id oxford-uuid:b2f5125c-4406-44ba-b599-61c916820571
institution University of Oxford