The transparency of quantitative empirical legal research published in highly ranked law journals (2018–2020): an observational study [version 2; peer review: 3 approved]

Bibliographic Details
Main Authors: Sarah Schiavone, Alex Holcombe, Ruby Bishop, Simine Vazire, Natali Dilevski, Rosemary Gatfield-Jeffries, Jason Chin, Kathryn Zeiler
Format: Article
Language: English
Published: F1000 Research Ltd, 2024-03-01
Series: F1000Research
Subjects: metaresearch; open science; transparency; credibility; empirical legal research
Online Access: https://f1000research.com/articles/12-144/v2
Collection: DOAJ
Description:
Background: Scientists are increasingly concerned with making their work easy to verify and build upon. Associated practices include sharing data, materials, and analytic scripts, and preregistering protocols. This shift towards increased transparency and rigor has been referred to as a “credibility revolution.” The credibility of empirical legal research has been questioned in the past due to its distinctive peer review system and because the legal background of its researchers means that many are not trained in study design or statistics. Still, there has been no systematic study of transparency and credibility-related characteristics of published empirical legal research. Methods: To fill this gap and provide an estimate of current practices that can be tracked as the field evolves, we assessed 300 empirical articles from highly ranked law journals, including both faculty-edited and student-edited journals. Results: We found high levels of article accessibility (86%, 95% CI = [82%, 90%]), especially among student-edited journals (100%). Few articles stated that a study’s data are available (19%, 95% CI = [15%, 23%]). Statements of preregistration (3%, 95% CI = [1%, 5%]) and availability of analytic scripts (6%, 95% CI = [4%, 9%]) were very uncommon. Some articles reported replication attempts that did not succeed (i.e., they collected new data using the study’s reported methods, but found results inconsistent or not as strong as the original). Conclusion: We suggest that empirical legal researchers and the journals that publish their work cultivate norms and practices to encourage research credibility. Our estimates may be revisited to track the field’s progress in the coming years.
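The results above attach 95% confidence intervals to proportions estimated from the sample of 300 coded articles. The record does not say which interval method was used; as a rough illustration only, a simple normal-approximation (Wald) interval applied to the rounded percentages lands within about one percentage point of the published bounds. The Wald formula and any implied counts below are assumptions for illustration, not the authors' documented procedure.

```python
import math

def wald_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) confidence interval for a sample proportion."""
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return max(0.0, p_hat - z * se), min(1.0, p_hat + z * se)

# Proportions as reported (rounded) in the abstract; n = 300 coded articles.
estimates = {
    "article accessibility": 0.86,  # reported CI [82%, 90%]
    "data availability":     0.19,  # reported CI [15%, 23%]
    "analytic scripts":      0.06,  # reported CI [4%, 9%]
    "preregistration":       0.03,  # reported CI [1%, 5%]
}

for label, p_hat in estimates.items():
    lo, hi = wald_ci(p_hat, n=300)
    print(f"{label}: {p_hat:.0%}, 95% CI [{lo:.0%}, {hi:.0%}]")
```

Because the abstract rounds its percentages (6% could correspond to, say, 19/300 ≈ 6.3%, for which the same formula gives [4%, 9%]), small mismatches against the published bounds are expected; Wilson or exact (Clopper-Pearson) intervals are also common choices for proportions near 0 or 1.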
Record ID: doaj.art-8da1060cac584119aceff6ef6a2e3b27
Institution: Directory of Open Access Journals
ISSN: 2046-1402
Author affiliations (ORCID given where available):
Sarah Schiavone: Psychology, University of California, Davis, Davis, CA, USA
Alex Holcombe (ORCID 0000-0003-2869-0085): Psychology, University of Sydney, Sydney, NSW, Australia
Ruby Bishop: School of Law, University of Sydney, Sydney, NSW, Australia
Simine Vazire: Melbourne School of Psychological Sciences, University of Melbourne, Melbourne, Vic, Australia
Natali Dilevski: Centre for Investigative Interviewing, Griffith Criminology Institute, Griffith University, Brisbane, Qld, Australia
Rosemary Gatfield-Jeffries (ORCID 0000-0003-0770-7319): History and Philosophy of Science, University of Cambridge, Cambridge, UK
Jason Chin (ORCID 0000-0002-6573-2670): College of Law, Australian National University, Canberra, ACT, Australia
Kathryn Zeiler (ORCID 0000-0001-5775-1321): School of Law, University of Boston, Boston, MA, USA