
Recommendations for reporting of systematic reviews and meta-analyses of diagnostic test accuracy: a systematic review

Overview of attention for article published in Systematic Reviews, October 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (94th percentile)
  • High Attention Score compared to outputs of the same age and source (88th percentile)

Mentioned by

85 X users

Citations

108 Dimensions

Readers on

125 Mendeley
1 CiteULike
Title
Recommendations for reporting of systematic reviews and meta-analyses of diagnostic test accuracy: a systematic review
Published in
Systematic Reviews, October 2017
DOI 10.1186/s13643-017-0590-8
Authors

Trevor A. McGrath, Mostafa Alabousi, Becky Skidmore, Daniël A. Korevaar, Patrick M. M. Bossuyt, David Moher, Brett Thombs, Matthew D. F. McInnes

Abstract

The aim of this study was to perform a systematic review of existing guidance on quality of reporting and methodology for systematic reviews of diagnostic test accuracy (DTA), in order to compile a list of potential items for inclusion in a reporting guideline for such reviews: Preferred Reporting Items for Systematic Reviews and Meta-Analyses of Diagnostic Test Accuracy (PRISMA-DTA). The study protocol was published on the EQUATOR website.

Articles, in full-text or abstract form, that reported on any aspect of reporting systematic reviews of diagnostic test accuracy were eligible for inclusion. We used the Ovid platform to search Ovid MEDLINE®, Ovid MEDLINE® In-Process & Other Non-Indexed Citations, and Embase Classic+Embase through May 5, 2016. The Cochrane Methodology Register in the Cochrane Library (Wiley version) was also searched. Title and abstract screening, followed by full-text screening of all search results, was performed independently by two investigators. Guideline organization websites, published guidance statements, and the Cochrane Handbook for Diagnostic Test Accuracy were also searched. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and Standards for Reporting Diagnostic Accuracy (STARD) statements were assessed independently by two investigators for relevant items.

The literature search yielded 6967 results; 386 were included after title and abstract screening and 203 after full-text screening. After reviewing the existing literature and guidance documents, a preliminary list of 64 items was compiled into the following categories: title (three items), introduction (two items), methods (35 items), results (13 items), discussion (nine items), and disclosure (two items). The items on the methods and reporting of DTA systematic reviews identified in the present systematic review will provide a basis for generating a PRISMA extension for DTA systematic reviews.

X Demographics

The data shown below were collected from the profiles of the 85 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 125 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 125 100%

Demographic breakdown

Readers by professional status Count As %
Student > Master 20 16%
Researcher 14 11%
Student > Ph. D. Student 13 10%
Other 12 10%
Student > Postgraduate 9 7%
Other 33 26%
Unknown 24 19%
Readers by discipline Count As %
Medicine and Dentistry 48 38%
Social Sciences 9 7%
Nursing and Health Professions 7 6%
Engineering 6 5%
Biochemistry, Genetics and Molecular Biology 5 4%
Other 15 12%
Unknown 35 28%
Attention Score in Context

This research output has an Altmetric Attention Score of 49. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 20 March 2018.
All research outputs
#874,578
of 25,595,500 outputs
Outputs from Systematic Reviews
#109
of 2,242 outputs
Outputs of similar age
#18,139
of 334,255 outputs
Outputs of similar age from Systematic Reviews
#6
of 44 outputs
Altmetric has tracked 25,595,500 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 96th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,242 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.2. This one has done particularly well, scoring higher than 95% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 334,255 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 94% of its contemporaries.
We're also able to compare this research output to 44 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 88% of its contemporaries.