
Designing a rapid response program to support evidence-informed decision-making in the Americas region: using the best available evidence and case studies

Overview of attention for article published in Implementation Science, August 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (93rd percentile)
  • High Attention Score compared to outputs of the same age and source (87th percentile)

Mentioned by

  • 1 policy source
  • 43 X users
  • 3 Facebook pages

Readers on

  • 124 Mendeley
  • 1 CiteULike
DOI 10.1186/s13012-016-0472-9
Authors

Michelle M. Haby, Evelina Chapman, Rachel Clark, Jorge Barreto, Ludovic Reveiz, John N. Lavis

Abstract

The objective of this work was to inform the design of a rapid response program to support evidence-informed decision-making in health policy and practice for the Americas region. Specifically, we focus on the following questions: (1) What are the best methodological approaches for rapid reviews of the research evidence? (2) What other strategies are needed to facilitate evidence-informed decision-making in health policy and practice? (3) How can a rapid response program best be operationalized?

The evidence used to inform the design of the program included (i) two rapid reviews, covering methodological approaches for rapid reviews of the research evidence and strategies to facilitate evidence-informed decision-making, (ii) supplementary literature on the "shortcuts" that could be taken to reduce the time needed to complete rapid reviews, (iii) four case studies, and (iv) supplementary literature identifying additional operational issues for the design of the program.

There is no agreed definition of rapid reviews in the literature and no agreed methodology for conducting them; better reporting of rapid review methods is needed. The literature on shortcuts will help in choosing those that maximize timeliness while minimizing the impact on quality. Evidence is also presented for other strategies that can be used concurrently to facilitate the uptake of research evidence, including evidence drawn from rapid reviews. Operational issues to consider in designing a rapid response program include the implications of a "user-pays" model, the importance of recruiting staff with the right mix of skills and qualifications, and ensuring that the impact of the model on research use in decision-making is formally evaluated.

When designing a new rapid response program, greater attention needs to be given to specifying the rapid review methods and reporting them in sufficient detail to allow a quality assessment. It will also be important to pursue other strategies that facilitate the uptake of the rapid reviews, and to evaluate the chosen model in order to refine it and add to the evidence base for evidence-informed decision-making.

X Demographics

The data shown below were collected from the profiles of the 43 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 124 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United States 1 <1%
Unknown 123 99%

Demographic breakdown

Readers by professional status Count As %
Researcher 28 23%
Student > Master 27 22%
Student > Ph.D. Student 7 6%
Other 6 5%
Student > Postgraduate 6 5%
Other 22 18%
Unknown 28 23%
Readers by discipline Count As %
Medicine and Dentistry 38 31%
Social Sciences 14 11%
Nursing and Health Professions 11 9%
Agricultural and Biological Sciences 4 3%
Business, Management and Accounting 4 3%
Other 20 16%
Unknown 33 27%
Attention Score in Context

This research output has an Altmetric Attention Score of 28. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 27 June 2022.
All research outputs
#1,330,711
of 24,657,405 outputs
Outputs from Implementation Science
#236
of 1,774 outputs
Outputs of similar age
#24,441
of 350,098 outputs
Outputs of similar age from Implementation Science
#5
of 33 outputs
Altmetric has tracked 24,657,405 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,774 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.9. This one has done well, scoring higher than 86% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 350,098 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 93% of its contemporaries.
We're also able to compare this research output to 33 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 87% of its contemporaries.