
What information is used in treatment decision aids? A systematic review of the types of evidence populating health decision aids

Overview of attention for article published in BMC Medical Informatics and Decision Making, February 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (85th percentile)
  • Good Attention Score compared to outputs of the same age and source (73rd percentile)

Mentioned by

X (formerly Twitter): 20 users

Readers on

Mendeley: 69 readers
DOI 10.1186/s12911-017-0415-7
Authors

Amanda M. Clifford, Jean Ryan, Cathal Walsh, Arlene McCurtin

Abstract

Patient decision aids (DAs) are support tools designed to provide patients with relevant information to help them make informed decisions about their healthcare. While DAs can be effective in improving patient knowledge and decision quality, it is unknown what types of information and evidence are used to populate such decision tools. Systematic methods were used to identify and appraise the relevant literature and patient DAs published between 2006 and 2015. Six databases (Academic Search Complete, AMED, CINAHL, Biomedical Reference Collection, General Sciences and MEDLINE) and reference-list searching were used. Articles evaluating the effectiveness of the DAs were appraised using the Cochrane Risk of Bias tool. The content, quality and sources of evidence in the decision aids were evaluated using the IPDASi-SF and a novel classification system. Findings were synthesised and a narrative analysis was performed on the results. Thirteen studies representing ten DAs met the inclusion criteria. IPDASi-SF scores ranged from 9 to 16, indicating that many of the studies met the majority of quality criteria. Sources of evidence were described, but reports were sometimes generic or missing important information. The majority of DAs incorporated high-quality research evidence, including systematic reviews and meta-analyses. Patient and practice evidence was less commonly employed, with only a third of included DAs using these to populate decision aid content. The quality of practice and patient evidence ranged from high to low. Contextual factors were addressed across all DAs to varying degrees and covered a range of factors. This is an initial study examining the information and evidence used to populate DAs. While research evidence and contextual factors are well represented in the included DAs, consideration should be given to incorporating high-quality information representing all four pillars of evidence-based practice when developing DAs. Further, patient and expert practice evidence should be acquired rigorously, and DAs should report the means by which such evidence was obtained, with citations clearly provided.

X Demographics

The data shown below were collected from the profiles of 20 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 69 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
United States | 1 | 1%
Unknown | 68 | 99%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 13 | 19%
Researcher | 11 | 16%
Student > Ph.D. Student | 6 | 9%
Librarian | 5 | 7%
Professor | 5 | 7%
Other | 19 | 28%
Unknown | 10 | 14%
Readers by discipline | Count | As %
Medicine and Dentistry | 19 | 28%
Nursing and Health Professions | 11 | 16%
Social Sciences | 9 | 13%
Psychology | 5 | 7%
Agricultural and Biological Sciences | 3 | 4%
Other | 9 | 13%
Unknown | 13 | 19%
Attention Score in Context

This research output has an Altmetric Attention Score of 14. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 10 March 2020.
All research outputs: #2,251,422 of 22,955,959 outputs
Outputs from BMC Medical Informatics and Decision Making: #146 of 2,001 outputs
Outputs of similar age: #45,385 of 311,210 outputs
Outputs of similar age from BMC Medical Informatics and Decision Making: #6 of 23 outputs
Altmetric has tracked 22,955,959 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 90th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,001 research outputs from this source. They receive a mean Attention Score of 4.9. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 311,210 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 85% of its contemporaries.
We're also able to compare this research output to 23 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 73% of its contemporaries.
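The percentiles quoted throughout this section follow directly from the rank-out-of-total figures listed above. As a rough sketch, assuming the simple convention that a percentile is the share of outputs ranked at or below this one (this is an illustration, not Altmetric's published methodology):

```python
# Sketch: mapping a rank like "#45,385 of 311,210" to a percentile.
# Assumption: percentile = fraction of tracked outputs this one outscores,
# truncated to a whole number. Not Altmetric's official formula.
def percentile_from_rank(rank: int, total: int) -> int:
    """Percentage of outputs this one scores higher than, given its rank."""
    return int((1 - rank / total) * 100)

# The four rankings quoted above:
print(percentile_from_rank(2_251_422, 22_955_959))  # all outputs        -> 90
print(percentile_from_rank(146, 2_001))             # same journal       -> 92
print(percentile_from_rank(45_385, 311_210))        # similar age        -> 85
print(percentile_from_rank(6, 23))                  # same age & journal -> 73
```

Each result matches the percentile stated in the prose (90th, 92%, 85%, 73%), which is a useful sanity check that the rankings and percentiles on this page are internally consistent.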