
Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement and extensions: a scoping review

Overview of attention for article published in Systematic Reviews, December 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#11 of 2,005)
  • High Attention Score compared to outputs of the same age (99th percentile)
  • High Attention Score compared to outputs of the same age and source (99th percentile)

Mentioned by

  • 1 blog
  • 355 tweeters
  • 1 Facebook page
  • 1 Google+ user

Citations

  • 287 Dimensions

Readers on

  • 421 Mendeley
Title
Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement and extensions: a scoping review
Published in
Systematic Reviews, December 2017
DOI 10.1186/s13643-017-0663-8
Pubmed ID
Authors

Matthew J. Page, David Moher

Abstract

The PRISMA Statement is a reporting guideline designed to improve transparency of systematic reviews (SRs) and meta-analyses. Seven extensions to the PRISMA Statement have been published to address the reporting of different types or aspects of SRs, and another eight are in development. We performed a scoping review to map the research that has been conducted to evaluate the uptake and impact of the PRISMA Statement and extensions. We also synthesised studies evaluating how well SRs published after the PRISMA Statement was disseminated adhere to its recommendations. We searched for meta-research studies indexed in MEDLINE® from inception to 31 July 2017, which investigated some component of the PRISMA Statement or extensions (e.g. SR adherence to PRISMA, journal endorsement of PRISMA). One author screened all records and classified the types of evidence available in the studies. We pooled data on SR adherence to individual PRISMA items across all SRs in the included studies and across SRs published after 2009 (the year PRISMA was disseminated). We included 100 meta-research studies. The most common type of evidence available was data on SR adherence to the PRISMA Statement, which has been evaluated in 57 studies that have assessed 6487 SRs. The pooled results of these studies suggest that reporting of many items in the PRISMA Statement is suboptimal, even in the 2382 SRs published after 2009 (where nine items were adhered to by fewer than 67% of SRs). Few meta-research studies have evaluated the adherence of SRs to the PRISMA extensions or strategies to increase adherence to the PRISMA Statement and extensions. Many studies have evaluated how well SRs adhere to the PRISMA Statement, and the pooled results of these suggest that reporting of many items is suboptimal. An update of the PRISMA Statement, along with a toolkit of strategies to help journals endorse and implement the updated guideline, may improve the transparency of SRs.
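The abstract describes pooling adherence data for individual PRISMA items across the included studies. The sketch below illustrates one simple way such pooling could be done: a sample-size-weighted proportion across hypothetical per-study counts. The study figures are invented for illustration, and the authors' exact pooling method may differ.

```python
# Minimal sketch: pooling SR adherence to a single PRISMA item across studies.
# The per-study counts below are hypothetical; pooling here is a simple
# sample-size-weighted proportion, not necessarily the authors' exact method.

studies = [
    # (number of SRs assessed, number of SRs adhering to the item)
    (120, 70),
    (85, 40),
    (200, 130),
]

total_srs = sum(n for n, _ in studies)
total_adherent = sum(k for _, k in studies)

pooled_adherence = total_adherent / total_srs
print(f"Pooled adherence: {pooled_adherence:.1%} of {total_srs} SRs")
```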

Twitter Demographics

The Twitter demographic data were collected from the profiles of the 355 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 421 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 421 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 44 | 10%
Student > Ph.D. Student | 32 | 8%
Student > Bachelor | 28 | 7%
Student > Doctoral Student | 27 | 6%
Researcher | 20 | 5%
Other | 86 | 20%
Unknown | 184 | 44%

Readers by discipline | Count | As %
Medicine and Dentistry | 67 | 16%
Nursing and Health Professions | 25 | 6%
Unspecified | 18 | 4%
Social Sciences | 15 | 4%
Engineering | 14 | 3%
Other | 77 | 18%
Unknown | 205 | 49%

Attention Score in Context

This research output has an Altmetric Attention Score of 222. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 June 2022.
All research outputs
#143,608
of 22,955,959 outputs
Outputs from Systematic Reviews
#11
of 2,005 outputs
Outputs of similar age
#3,626
of 440,014 outputs
Outputs of similar age from Systematic Reviews
#1
of 56 outputs
Altmetric has tracked 22,955,959 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 99th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,005 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.7. This one has done particularly well, scoring higher than 99% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 440,014 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 99% of its contemporaries.
We're also able to compare this research output to 56 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 99% of its contemporaries.
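As a rough illustration of how the percentile figures quoted above follow from the rank-in-context numbers, the sketch below applies a simple rank-to-percentile conversion. This is an assumption for illustration only; Altmetric's exact calculation may differ.

```python
# Rough sketch: converting a rank among tracked outputs into a percentile.
# Uses a simple (total - rank) / total conversion; Altmetric's exact
# method may differ.

def percentile_from_rank(rank: int, total: int) -> float:
    """Fraction of tracked outputs that this output scored higher than."""
    return (total - rank) / total

# Figures taken from the "Attention Score in Context" section above.
contexts = {
    "All research outputs": (143_608, 22_955_959),
    "Outputs from Systematic Reviews": (11, 2_005),
    "Outputs of similar age": (3_626, 440_014),
    "Outputs of similar age from Systematic Reviews": (1, 56),
}

for name, (rank, total) in contexts.items():
    print(f"{name}: ~{percentile_from_rank(rank, total):.0%}")
```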