News Release

An analysis of psychological meta-analyses reveals a reproducibility problem

Peer-Reviewed Publication


Meta-analysis research studies in psychology aren't always reproducible due to a lack of transparency of reporting in the meta-analysis process, according to a new study published May 27, 2020 in the open-access journal PLOS ONE by Esther Maassen of Tilburg University, the Netherlands, and colleagues.

Meta-analysis is a widely used method to combine and compare quantitative data from multiple primary studies. The statistical approach used in meta-analyses can reveal whether study outcomes differ based on particular study characteristics, and it can help compute an overall effect size--for instance, the magnitude of a treatment effect--for the topic of interest. However, many steps of a meta-analysis involve decisions and judgments that can be arbitrary or vary from one researcher to another.
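To illustrate what "computing an overall effect size" means, here is a minimal sketch (not the authors' code; the effect sizes and variances are hypothetical) of a fixed-effect meta-analysis, which pools primary-study effect sizes by weighting each study by the inverse of its variance:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect size and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical primary studies: standardized mean differences and variances
effects = [0.30, 0.50, 0.20]
variances = [0.04, 0.09, 0.02]
pooled, se = fixed_effect_meta(effects, variances)
# More precise studies (smaller variance) pull the pooled estimate toward them.
```

Because the pooled estimate is driven entirely by the individual effect sizes fed into it, errors in reproducing those inputs can propagate into the overall result.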

In the new study, researchers analyzed 33 meta-analysis articles in the field of psychology. The meta-analytical studies were all published in 2011 and 2012, all had data tables with primary studies, and all included at least ten primary studies. For each meta-analysis, the team searched for the corresponding primary study articles, followed any methods detailed in the meta-analysis article, and recomputed a total of 500 effect sizes reported in the meta-analyses.

Of the 500 primary study effect sizes, the researchers were able to reproduce 276 (55%) without any problems. (Here, reproducibility was defined as arriving at the same result after reanalyzing the same data following the reported procedures.) In some cases, however, the meta-analyses did not contain enough information to reproduce a study's effect size, while in others the recomputed effect differed from the one stated. In total, 114 effect sizes (23%) showed discrepancies from what was reported in the meta-analytic article, and 30 of the 33 meta-analyses contained at least one effect size that could not be easily reproduced.
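The kind of check involved can be sketched as follows (an illustrative assumption, not the authors' actual procedure or data): recompute a standardized mean difference (Cohen's d) from a primary study's reported statistics and flag it when it disagrees with the value in the meta-analytic table beyond rounding tolerance.

```python
import math

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

reported_d = 0.50  # hypothetical value taken from a meta-analysis data table
recomputed_d = cohens_d(m1=10.2, m2=9.1, sd1=2.4, sd2=2.6, n1=40, n2=38)
# Flag a discrepancy if the recomputed value differs beyond rounding error.
discrepant = abs(recomputed_d - reported_d) > 0.01
```

In this toy example the recomputed d is about 0.44, so it would be flagged as discrepant with the reported 0.50.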

When the erroneous or unreproducible effect sizes were carried through to the overall meta-analytic results, the team found that 13 of the 33 meta-analyses (39%) showed discrepancies in their outcomes, although many of these were negligible. The researchers recommend extending existing guidelines for the publication of psychological meta-analyses to make them more reproducible.

The authors add: "Individual effect sizes from meta-analyses in psychology are difficult to reproduce due to inaccurate and incomplete reporting in the meta-analysis. To increase the trustworthiness of meta-analytic results, it is essential that researchers explicitly document their data handling practices and workflow, as well as publish their data and code online."


Citation: Maassen E, van Assen MALM, Nuijten MB, Olsson-Collentine A, Wicherts JM (2020) Reproducibility of individual effect sizes in meta-analyses in psychology. PLoS ONE 15(5): e0233107.

Funding: This research was supported by a Consolidator Grant 726361 (IMPROVE) from the European Research Council (ERC), awarded to J.M. Wicherts. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: I have read the journal's policy and the authors of this manuscript have the following competing interests: Jelte M. Wicherts is a PLOS ONE Editorial Board member. This does not alter the authors' adherence to PLOS ONE Editorial policies and criteria.

In your coverage please use this URL to provide access to the freely available article in PLOS ONE:

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.