
Updating standards for reporting diagnostic accuracy: the development of STARD 2015

Overview of attention for an article published in Research Integrity and Peer Review, June 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (80th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

17 X users

Citations

52 Dimensions

Readers on

78 Mendeley
Title
Updating standards for reporting diagnostic accuracy: the development of STARD 2015
Published in
Research Integrity and Peer Review, June 2016
DOI 10.1186/s41073-016-0014-7
Authors

Daniël A. Korevaar, Jérémie F. Cohen, Johannes B. Reitsma, David E. Bruns, Constantine A. Gatsonis, Paul P. Glasziou, Les Irwig, David Moher, Henrica C. W. de Vet, Douglas G. Altman, Lotty Hooft, Patrick M. M. Bossuyt

Abstract

Although the number of reporting guidelines has grown rapidly, few have gone through an updating process. The STARD statement (Standards for Reporting Diagnostic Accuracy), published in 2003 to help improve the transparency and completeness of reporting of diagnostic accuracy studies, was recently updated in a systematic way. Here, we describe the steps taken and the justification for the changes made.

A 4-member Project Team coordinated the updating process; a 14-member Steering Committee was regularly solicited by the Project Team when making critical decisions. First, a literature review was performed to identify topics and items potentially relevant to the STARD update. The 85 members of the STARD Group were then invited to participate in two online surveys to identify items that needed to be modified in, removed from, or added to the STARD checklist. Based on the results of the literature review, 33 items were presented to the STARD Group in the online surveys: 25 original items and 8 new items. Seventy-three STARD Group members (86%) completed the first survey, and 79 (93%) completed the second. An in-person consensus meeting was then organized among the members of the Project Team and Steering Committee to develop a consensus draft of STARD 2015. This draft was piloted in three rounds among a total of 32 expert and non-expert users; piloting mostly led to rewording of items. After this, the update was finalized.

The updated STARD 2015 list consists of 30 items. Compared with the previous version of STARD, three original items were each split into two new items, four original items were incorporated into other items, and seven new items were added. After this systematic updating process, STARD 2015 provides an updated list of 30 essential items for reporting diagnostic accuracy studies.

X Demographics

The data shown below were collected from the profiles of the 17 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 78 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown       78   100%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                          12    15%
Student > Master                    12    15%
Student > Doctoral Student          11    14%
Other                               10    13%
Student > Ph. D. Student             6     8%
Other                               12    15%
Unknown                             15    19%
Readers by discipline                          Count   As %
Medicine and Dentistry                            37    47%
Nursing and Health Professions                     9    12%
Biochemistry, Genetics and Molecular Biology       3     4%
Psychology                                         3     4%
Linguistics                                        1     1%
Other                                             10    13%
Unknown                                           15    19%
Attention Score in Context

This research output has an Altmetric Attention Score of 9. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 December 2017.
  • All research outputs: #3,918,426 of 23,929,753 outputs
  • Outputs from Research Integrity and Peer Review: #107 of 120 outputs
  • Outputs of similar age: #66,768 of 345,269 outputs
  • Outputs of similar age from Research Integrity and Peer Review: #8 of 10 outputs
Altmetric has tracked 23,929,753 research outputs across all sources so far. Compared to these, this one has done well and is in the 83rd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 120 research outputs from this source. They typically receive far more attention than average, with a mean Attention Score of 70.0. This one is in the 11th percentile; that is, 11% of its peers scored the same as or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to those of the 345,269 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 80% of its contemporaries.
We're also able to compare this research output to 10 others from the same source and published within six weeks on either side of this one. This one has scored higher than 2 of them.
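
As an illustration of the arithmetic behind these percentile figures, here is a minimal Python sketch (the scores and function name are made up for illustration, not Altmetric's actual data or API) that computes a percentile rank under the definition used above: the share of peers scoring the same as or lower than the output in question.

    def percentile_rank(score, peer_scores):
        # Percent of peers whose score is the same as or lower than `score`.
        at_or_below = sum(1 for s in peer_scores if s <= score)
        return 100.0 * at_or_below / len(peer_scores)

    # Hypothetical example: an Attention Score of 9 among ten tracked peers.
    peers = [3, 5, 9, 12, 18, 25, 40, 70, 95, 150]
    print(round(percentile_rank(9, peers)))  # prints 30, i.e. the 30th percentile

Under this definition, being in the 11th percentile of this source simply means that about 11% of the 120 tracked outputs from Research Integrity and Peer Review scored the same as or lower than this one.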