
Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study

Overview of attention for an article published in Implementation Science, February 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (95th percentile)
  • High Attention Score compared to outputs of the same age and source (91st percentile)

Mentioned by

77 X users

Citations

13 Dimensions

Readers on

55 Mendeley
Title
Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study
Published in
Implementation Science, February 2017
DOI 10.1186/s13012-017-0545-4
Pubmed ID
Authors

Paul M Wilson, Kate Farley, Liz Bickerdike, Alison Booth, Duncan Chambers, Mark Lambert, Carl Thompson, Rhiannon Turner, Ian S Watt

Abstract

The Health and Social Care Act mandated research use as a core consideration of health service commissioning arrangements in England. We undertook a controlled before and after study to evaluate whether access to a demand-led evidence briefing service improved the use of research evidence by commissioners compared with less intensive and less targeted alternatives. Nine Clinical Commissioning Groups (CCGs) in the North of England received one of three interventions: (A) access to an evidence briefing service; (B) contact plus an unsolicited push of non-tailored evidence; or (C) an unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months using a survey instrument devised to assess an organisation's ability to acquire, assess, adapt and apply research evidence to support decision-making. Documentary and observational evidence of the use of the service's outputs was sought.

Over the course of the study, the service addressed 24 topics raised by participating CCGs. At 12 months, the evidence briefing service was not associated with increases in CCG capacity to acquire, assess, adapt and apply research evidence to support decision-making, in individual intentions to use research findings, or in perceptions of CCG relationships with researchers. Regardless of the intervention received, participating CCGs indicated that they remained inconsistent in their research-seeking behaviours and in their capacity to acquire research. The informal nature of decision-making processes meant that there was little traceability of the use of evidence. Low baseline and follow-up response rates and missing data limit the reliability of the findings.

Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well-intentioned but ad hoc users of research. Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research.

X Demographics

The data shown below were collected from the profiles of 77 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 55 Mendeley readers of this research output.

Geographical breakdown

Country           Count   As %
United Kingdom        1     2%
Unknown              54    98%

Demographic breakdown

Readers by professional status    Count   As %
Student > Master                     11    20%
Researcher                            8    15%
Student > Ph. D. Student              7    13%
Student > Bachelor                    4     7%
Professor > Associate Professor       3     5%
Other                                10    18%
Unknown                              12    22%

Readers by discipline                   Count   As %
Medicine and Dentistry                     12    22%
Social Sciences                             8    15%
Psychology                                  4     7%
Business, Management and Accounting         3     5%
Nursing and Health Professions              3     5%
Other                                      10    18%
Unknown                                    15    27%
Attention Score in Context

This research output has an Altmetric Attention Score of 46. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 22 June 2022.
  • All research outputs: #890,790 of 25,022,483 outputs
  • Outputs from Implementation Science: #114 of 1,795 outputs
  • Outputs of similar age: #20,339 of 439,003 outputs
  • Outputs of similar age from Implementation Science: #5 of 45 outputs
Altmetric has tracked 25,022,483 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 96th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,795 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.9. This one has done particularly well, scoring higher than 93% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 439,003 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 95% of its contemporaries.
We're also able to compare this research output to 45 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 91% of its contemporaries.