
Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: protocol for a controlled before and after study

Overview of attention for article published in Implementation Science, January 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • High Attention Score compared to outputs of the same age and source (82nd percentile)

Mentioned by

Twitter
26 tweeters

Citations

Dimensions
7 citations

Readers on

Mendeley
46 readers
DOI 10.1186/s13012-014-0199-4
Authors

Paul M Wilson, Kate Farley, Carl Thompson, Duncan Chambers, Liz Bickerdike, Ian S Watt, Mark Lambert, Rhiannon Turner

Abstract

Background: Clinical Commissioning Groups (CCGs) are mandated to use research evidence effectively to ensure optimum use of resources by the National Health Service (NHS), both in accelerating innovation and in stopping the use of less effective practices and models of service delivery. We intend to evaluate whether access to a demand-led evidence service improves uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives.

Methods/design: This is a controlled before and after study involving CCGs in the North of England. Participating CCGs will receive one of three interventions to support the use of research evidence in their decision-making: 1) consulting plus responsive push of tailored evidence; 2) consulting plus an unsolicited push of non-tailored evidence; or 3) standard service unsolicited push of non-tailored evidence. Our primary outcome will be change at 12 months from baseline in a CCG's ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes will measure individual clinical leads' and managers' intentions to use research evidence in decision-making. Documentary evidence of the use of the outputs of the service will be sought. A process evaluation will evaluate the nature and success of the interactions both within the sites and between commissioners and researchers delivering the service.

Discussion: The proposed research will generate new knowledge of direct relevance and value to the NHS. The findings will help to clarify which elements of the service are of value in promoting the use of research evidence. Those involved in NHS commissioning will be able to use the results to inform how best to build the infrastructure they need to acquire, assess, adapt and apply research evidence to support decision-making and to fulfil their statutory duties under the Health and Social Care Act.

Twitter Demographics

The data shown below were collected from the profiles of 26 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 46 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 1 2%
Brazil 1 2%
Unknown 44 96%

Demographic breakdown

Readers by professional status Count As %
Researcher 8 17%
Student > Ph.D. Student 7 15%
Student > Master 6 13%
Student > Doctoral Student 5 11%
Professor 5 11%
Other 11 24%
Unknown 4 9%
Readers by discipline Count As %
Medicine and Dentistry 13 28%
Social Sciences 9 20%
Engineering 5 11%
Nursing and Health Professions 4 9%
Computer Science 2 4%
Other 8 17%
Unknown 5 11%

Attention Score in Context

This research output has an Altmetric Attention Score of 17. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 February 2017.
All research outputs
#1,265,751
of 16,135,357 outputs
Outputs from Implementation Science
#360
of 1,520 outputs
Outputs of similar age
#25,598
of 307,167 outputs
Outputs of similar age from Implementation Science
#25
of 142 outputs
Altmetric has tracked 16,135,357 research outputs across all sources so far. Compared to these this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,520 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.0. This one has done well, scoring higher than 76% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 307,167 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 142 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 82% of its contemporaries.
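Altmetric does not publish its exact rounding rules, but the percentiles quoted above are consistent with a simple rank-based calculation over each comparison group. A minimal sketch (the `percentile` helper is ours, not an Altmetric API):

```python
def percentile(rank: int, total: int) -> int:
    """Share of outputs scored lower than this one, as a whole percent.

    rank is 1-based, where 1 is the highest-scoring output in the group.
    """
    return int((1 - rank / total) * 100)

# The four rankings quoted above for this article:
print(percentile(1_265_751, 16_135_357))  # all research outputs -> 92
print(percentile(360, 1_520))             # Implementation Science -> 76
print(percentile(25_598, 307_167))        # outputs of similar age -> 91
print(percentile(25, 142))                # similar age, same source -> 82
```

Each result matches the percentile stated in the corresponding paragraph, which suggests the scores in context are straightforward rank positions within each comparison pool.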