Comparing meta-analyses and preregistered multiple-laboratory replication projects.

2019 
Many researchers rely on meta-analysis to summarize research evidence. However, there is a concern that publication bias and selective reporting may lead to biased meta-analytic effect sizes. We compare the results of meta-analyses to large-scale preregistered replications in psychology carried out at multiple laboratories. The multiple-laboratory replications provide precisely estimated effect sizes that do not suffer from publication bias or selective reporting. We searched the literature and identified 15 meta-analyses on the same topics as multiple-laboratory replications. We find that meta-analytic effect sizes are significantly different from replication effect sizes for 12 out of the 15 meta-replication pairs. These differences are systematic and, on average, meta-analytic effect sizes are almost three times as large as replication effect sizes. We also implement three methods of correcting meta-analyses for bias, but these methods do not substantively improve the meta-analytic results.

Kvarven, Strømland and Johannesson compare meta-analyses to multiple-laboratory replication projects and find that meta-analyses overestimate effect sizes by a factor of almost three. Commonly used methods of adjusting for publication bias do not substantively improve the results.
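To illustrate the kind of pairwise comparison described above, the following is a minimal sketch of how a meta-analytic effect size can be tested against a replication effect size using a z-test on the difference of two independent estimates, and how the ratio of the two effect sizes can be computed. The function name and the numeric values are illustrative assumptions, not the paper's data or code.

```python
from math import sqrt
from scipy.stats import norm

def compare_effects(d_meta, se_meta, d_rep, se_rep):
    """Two-sided z-test for the difference between a meta-analytic and a
    replication effect size (both on the same scale, e.g. Cohen's d),
    treating the two estimates as independent."""
    diff = d_meta - d_rep
    se_diff = sqrt(se_meta**2 + se_rep**2)  # SE of the difference
    z = diff / se_diff
    p = 2 * norm.sf(abs(z))                 # two-sided p-value
    return diff, z, p

# Illustrative numbers only (not taken from the paper): a meta-analysis
# reporting d = 0.45 (SE 0.06) versus a multiple-laboratory replication
# reporting d = 0.15 (SE 0.03).
diff, z, p = compare_effects(0.45, 0.06, 0.15, 0.03)
print(f"difference = {diff:.2f}, z = {z:.2f}, p = {p:.4f}")
print(f"ratio of effect sizes = {0.45 / 0.15:.1f}x")
```

Under these hypothetical inputs the meta-analytic estimate is three times the replication estimate and the difference is statistically significant, mirroring the general pattern the paper reports across its 15 pairs.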