How do online users respond to crowdsourced fact-checking?

Abstract: Recently, crowdsourcing has been proposed as a tool for fighting misinformation online. Will internet users listen to crowdsourced fact-checking, and how? In this experiment we test how participants follow others’ opinions to evaluate the validity of a science-themed Facebook post and examine which factors mediate the use of this information. Participants observed a post presenting either scientific information or misinformation, along with a graphical summary of previous participants’ judgements. Even though most participants reported not having used information from previous raters, their responses were influenced by previous assessments. This happened regardless of whether prior judgements were accurate or misleading. Presenting crowdsourced fact-checking, however, did not translate into the blind copying of the majority response. Rather, participants tended to use this social information as a cue to guide their response, while also relying on individual evaluation and research for extra information. These results highlight the role of individual reasoning when evaluating online information, while pointing to the potential benefit of crowdsourcing-based solutions in making online users more resilient to misinformation.


Bibliographic Details
Main Authors: Folco Panizza, Piero Ronzani, Tiffany Morisseau, Simone Mattavelli, Carlo Martini
Format: Article
Language: English
Published: Springer Nature, 2023-11-01
Series: Humanities & Social Sciences Communications
Online Access: https://doi.org/10.1057/s41599-023-02329-y
author Folco Panizza
Piero Ronzani
Tiffany Morisseau
Simone Mattavelli
Carlo Martini
collection DOAJ
description Abstract: Recently, crowdsourcing has been proposed as a tool for fighting misinformation online. Will internet users listen to crowdsourced fact-checking, and how? In this experiment we test how participants follow others’ opinions to evaluate the validity of a science-themed Facebook post and examine which factors mediate the use of this information. Participants observed a post presenting either scientific information or misinformation, along with a graphical summary of previous participants’ judgements. Even though most participants reported not having used information from previous raters, their responses were influenced by previous assessments. This happened regardless of whether prior judgements were accurate or misleading. Presenting crowdsourced fact-checking, however, did not translate into the blind copying of the majority response. Rather, participants tended to use this social information as a cue to guide their response, while also relying on individual evaluation and research for extra information. These results highlight the role of individual reasoning when evaluating online information, while pointing to the potential benefit of crowdsourcing-based solutions in making online users more resilient to misinformation.
first_indexed 2024-03-09T15:24:04Z
format Article
id doaj.art-16e01f9aded3429b93177d0f2763c75c
institution Directory Open Access Journal
issn 2662-9992
language English
last_indexed 2024-03-09T15:24:04Z
publishDate 2023-11-01
publisher Springer Nature
record_format Article
series Humanities & Social Sciences Communications
spelling doaj.art-16e01f9aded3429b93177d0f2763c75c 2023-11-26T12:38:07Z
eng
Springer Nature
Humanities & Social Sciences Communications
2662-9992
2023-11-01
10.1057/s41599-023-02329-y
How do online users respond to crowdsourced fact-checking?
Folco Panizza (Molecular Mind Laboratory, IMT School for Advanced Studies Lucca)
Piero Ronzani (International Security and Development Center)
Tiffany Morisseau (Université Paris Cité and Univ. Gustave Eiffel, LaPEA)
Simone Mattavelli (Department of Psychology, Bicocca University)
Carlo Martini (Centre for Applied and Experimental Epistemology, Department of Philosophy, Vita-Salute San Raffaele University)
https://doi.org/10.1057/s41599-023-02329-y
title How do online users respond to crowdsourced fact-checking?
url https://doi.org/10.1057/s41599-023-02329-y