Is there evidence of publication biases in JDM research?


Bibliographic Details
Main Authors: Frank Renkewitz, Heather M. Fuchs, Susann Fiedler, Andreas Glöckner, Benjamin E. Hilbig
Format: Article
Language: English
Published: Cambridge University Press, 2011-12-01
Series: Judgment and Decision Making
Online Access: https://www.cambridge.org/core/product/identifier/S1930297500004289/type/journal_article
Description
Summary: It is a long-known problem that the preferential publication of statistically significant results (publication bias) may lead to incorrect estimates of the true effects being investigated. Even though other research areas (e.g., medicine, biology) are aware of the problem and have identified strong publication biases, researchers in judgment and decision making (JDM) largely ignore it. We reanalyzed two current meta-analyses in this area. Both showed evidence of publication biases that may have led to a substantial overestimation of the true effects they investigated. A review of additional JDM meta-analyses shows that most either conducted no analyses of publication bias or analyzed it insufficiently. However, given our results and the rarity of non-significant effects in the literature, we suspect that such biases occur quite often. These findings suggest that (a) conclusions based on meta-analyses without reported tests of publication bias should be interpreted with caution, and (b) publication policies and standard research practices should be revised to overcome the problem.
ISSN:1930-2975