Down by algorithms? Siphoning rents, exploiting biases, and shaping preferences: Regulating the dark side of personalized transactions



Detailed bibliography

Main authors: Wagner, G.; Eidenmueller, H.
Format: Journal article
Language: English
Published: University of Chicago Press, 2019

Description
Summary: In this Essay, we seek to systematically explore and understand crucial aspects of the dark side of personalized business-to-consumer (B2C) transactions. We identify three areas of concern. First, businesses increasingly engage in first-degree price discrimination, siphoning rents from consumers. Second, firms exploit widespread or idiosyncratic behavioral biases of consumers in a systematic fashion. And third, businesses use microtargeted ads and recommendations to shape consumers’ preferences and steer them into a particular consumption pattern. Siphoning rents, exploiting biases, and shaping preferences appear to be relatively distinct phenomena. However, these phenomena share a common underlying theme: the exploitation of consumers, or at least an impoverishment of their lives, by firms that apply novel and sophisticated technological means to maximize profits. Hence, the dark side of personalized B2C transactions may be characterized as consumers being “brought down by algorithms,” losing transaction surplus, engaging in welfare-reducing transactions, and increasingly being trapped in a narrower life. It is unclear whether first-degree price discrimination creates an efficiency problem, but it surely raises concerns of distributive justice. We propose that it should be addressed by a clear and simple warning to the consumer that she is being offered a personalized price and, in addition, a right to indicate that she does not want to participate in a personalized pricing scheme. Similarly, behavioral biases may or may not lead consumers to conclude inefficient transactions. But consumers should be given an opportunity to reflect on their choices if these have been induced by firms applying exploitative algorithmic sales techniques. Hence, we propose that consumers should have a right to withdraw from a contract concluded under such conditions.
Finally, shaping consumers’ preferences by microtargeted ads and recommendations prevents consumers from experimenting. They should have a right to opt out of the technological steering mechanisms created and utilized by firms that impoverish their lives. Regulation along the lines proposed in this Essay is necessary because competitive markets will not protect unknowledgeable or otherwise weak consumers from exploitation. A general “right to anonymity” for consumers in the digital world could be the macrosolution to the microproblems discussed. If it were recognized and protected by the law, it might be possible to reap (most of) the benefits of personalization while avoiding (most of) its pitfalls.