
On the reproducibility of meta-analyses: six practical recommendations

Overview of attention for article published in BMC Psychology, May 2016

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#38 of 1,136)
  • High Attention Score compared to outputs of the same age (97th percentile)
  • High Attention Score compared to outputs of the same age and source (82nd percentile)

Mentioned by

  • 1 blog
  • 145 X users
  • 2 Wikipedia pages

Citations

  • 167 Dimensions

Readers on

  • 266 Mendeley
Title
On the reproducibility of meta-analyses: six practical recommendations
Published in
BMC Psychology, May 2016
DOI 10.1186/s40359-016-0126-3
Authors

Daniël Lakens, Joe Hilgard, Janneke Staaks

Abstract

Meta-analyses play an important role in cumulative science by combining information across multiple studies and attempting to provide effect size estimates corrected for publication bias. Research on the reproducibility of meta-analyses reveals that errors are common, and the percentage of effect size calculations that cannot be reproduced is much higher than is desirable. Furthermore, the flexibility in inclusion criteria when performing a meta-analysis, combined with the many conflicting conclusions drawn by meta-analyses of the same set of studies performed by different researchers, has led some to doubt whether meta-analyses can provide objective conclusions. The present article highlights the need to improve the reproducibility of meta-analyses to facilitate the identification of errors, allow researchers to examine the impact of subjective choices such as inclusion criteria, and update the meta-analysis after several years. Reproducibility can be improved by applying standardized reporting guidelines and sharing all data underlying the meta-analysis, including quotes from articles to specify how effect sizes were calculated. Pre-registration of the research protocol (which can be peer-reviewed using novel 'registered report' formats) can be used to distinguish a priori analysis plans from data-driven choices, and reduce the amount of criticism after the results are known. The recommendations put forward in this article aim to improve the reproducibility of meta-analyses. In addition, they have the benefit of "future-proofing" meta-analyses by allowing the shared data to be re-analyzed as new theoretical viewpoints emerge or as novel statistical techniques are developed. Adoption of these practices will lead to increased credibility of meta-analytic conclusions, and facilitate cumulative scientific knowledge.
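The pooling step the abstract refers to can be illustrated with a minimal sketch. This is not the authors' procedure, and the study data below are hypothetical; it shows only the generic inverse-variance (fixed-effect) combination of effect sizes that underlies most meta-analytic software:

```python
# Illustrative fixed-effect meta-analysis via inverse-variance weighting.
# Effect sizes and variances are made up for demonstration purposes.
import math

effect_sizes = [0.30, 0.45, 0.12, 0.50]  # e.g. standardized mean differences
variances = [0.04, 0.09, 0.02, 0.05]     # sampling variance of each estimate

# Each study is weighted by the inverse of its sampling variance,
# so more precise studies contribute more to the pooled estimate.
weights = [1 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effect_sizes)) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

print(f"Pooled effect: {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

Because every input in this calculation is explicit, anyone holding the shared data can re-run it, check each effect size against the quoted source passage, and re-analyze with different inclusion criteria, which is precisely the reproducibility the article argues for.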

X Demographics

The data shown below were collected from the profiles of 145 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 266 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 3 1%
Chile 2 <1%
United States 2 <1%
Italy 1 <1%
Brazil 1 <1%
Germany 1 <1%
Bosnia and Herzegovina 1 <1%
Macao 1 <1%
Unknown 254 95%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 64 24%
Researcher 35 13%
Student > Master 31 12%
Student > Doctoral Student 20 8%
Student > Bachelor 20 8%
Other 62 23%
Unknown 34 13%
Readers by discipline Count As %
Psychology 110 41%
Social Sciences 24 9%
Medicine and Dentistry 16 6%
Agricultural and Biological Sciences 10 4%
Computer Science 8 3%
Other 51 19%
Unknown 47 18%
Attention Score in Context

This research output has an Altmetric Attention Score of 91. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 18 November 2023.
All research outputs
#472,822
of 26,017,215 outputs
Outputs from BMC Psychology
#38
of 1,136 outputs
Outputs of similar age
#9,214
of 358,441 outputs
Outputs of similar age from BMC Psychology
#3
of 17 outputs
Altmetric has tracked 26,017,215 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 97th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,136 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 16.4. This one has done particularly well, scoring higher than 96% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 358,441 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 97% of its contemporaries.
We're also able to compare this research output to 17 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 82% of its contemporaries.
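The percentile claims above follow directly from the rankings quoted on this page. A small sketch makes the assumed arithmetic explicit (the formula, the share of outputs ranked below this one, is an assumption on my part, not Altmetric's documented method):

```python
# Hypothetical reconstruction of "scored higher than N% of its peers"
# from a rank within a total, truncated to a whole percent.

def percentile_rank(rank: int, total: int) -> int:
    """Percent of outputs this one scored higher than, truncated."""
    return int((total - rank) / total * 100)

# Rankings quoted on this page:
print(percentile_rank(9214, 358441))  # outputs of similar age -> 97
print(percentile_rank(3, 17))         # similar age, same source -> 82
print(percentile_rank(38, 1136))      # all BMC Psychology outputs -> 96
```

Each result matches the corresponding percentile stated in the text above (97%, 82%, and 96%), which suggests the comparisons are simple rank-based shares of the relevant comparison set.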