
“Scaling-out” evidence-based interventions to new populations or new health care delivery systems

Overview of attention for article published in Implementation Science, September 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (94th percentile)
  • High Attention Score compared to outputs of the same age and source (87th percentile)

Mentioned by

  • 81 X users

Citations

  • 280 citations (Dimensions)

Readers on

  • 328 readers (Mendeley)
Title
“Scaling-out” evidence-based interventions to new populations or new health care delivery systems
Published in
Implementation Science, September 2017
DOI 10.1186/s13012-017-0640-6
Pubmed ID
Authors

Gregory A. Aarons, Marisa Sklar, Brian Mustanski, Nanette Benbow, C. Hendricks Brown

Abstract

Implementing treatments and interventions with demonstrated effectiveness is critical for improving patient health outcomes at a reduced cost. When an evidence-based intervention (EBI) is implemented with fidelity in a setting that is very similar to the setting wherein it was previously found to be effective, it is reasonable to anticipate similar benefits of that EBI. However, one goal of implementation science is to expand the use of EBIs as broadly as is feasible and appropriate in order to foster the greatest public health impact. When implementing an EBI in a novel setting, or targeting novel populations, one must consider whether there is sufficient justification that the EBI would have similar benefits to those found in earlier trials. In this paper, we introduce a new concept for implementation called "scaling-out" when EBIs are adapted either to new populations or new delivery systems, or both. Using existing external validity theories and multilevel mediation modeling, we provide a logical framework for determining what new empirical evidence is required for an intervention to retain its evidence-based standard in this new context. The motivating questions are whether scale-out can reasonably be expected to produce population-level effectiveness as found in previous studies, and what additional empirical evaluations would be necessary to test for this short of an entirely new effectiveness trial. We present evaluation options for assessing whether scaling-out results in the ultimate health outcome of interest. In scaling to health or service delivery systems or population/community contexts that are different from the setting where the EBI was originally tested, there are situations where a shorter timeframe of translation is possible. We argue that implementation of an EBI in a moderately different setting or with a different population can sometimes "borrow strength" from evidence of impact in a prior effectiveness trial. The collection of additional empirical data is deemed necessary by the nature and degree of adaptations to the EBI and the context. Our argument in this paper is conceptual, and we propose formal empirical tests of mediational equivalence in a follow-up paper.

X Demographics

The data shown below were collected from the profiles of 81 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 328 Mendeley readers of this research output.

Geographical breakdown

Country                               Count   As %
Unknown                                328    100%

Demographic breakdown

Readers by professional status        Count   As %
Researcher                              69     21%
Student > Ph. D. Student                46     14%
Student > Master                        31      9%
Student > Doctoral Student              28      9%
Other                                   22      7%
Other                                   58     18%
Unknown                                 74     23%

Readers by discipline                 Count   As %
Medicine and Dentistry                  60     18%
Psychology                              49     15%
Social Sciences                         48     15%
Nursing and Health Professions          33     10%
Business, Management and Accounting      5      2%
Other                                   38     12%
Unknown                                 95     29%
Attention Score in Context

This research output has an Altmetric Attention Score of 46. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 November 2021.
All research outputs: #896,378 of 25,080,267 outputs
Outputs from Implementation Science: #113 of 1,796 outputs
Outputs of similar age: #18,394 of 320,987 outputs
Outputs of similar age from Implementation Science: #5 of 31 outputs
Altmetric has tracked 25,080,267 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 96th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,796 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.8. This one has done particularly well, scoring higher than 93% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 320,987 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 94% of its contemporaries.
We're also able to compare this research output to the 31 others from the same source that were published within six weeks on either side of it. This one has done well, scoring higher than 87% of its contemporaries.
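As a rough illustration of how the rank figures above map onto the percentile statements, the sketch below converts each "rank of N outputs" pair into an approximate percentile. This is an assumption, not Altmetric's documented method: it treats the percentile as the share of scored outputs ranked below this one, so it lands close to the page's 96%, 93%, 94%, and 87% figures but not exactly on them (Altmetric's tie and rounding handling is unspecified, and the page's percentiles were fixed when the output was last mentioned on 23 November 2021, while the ranks may reflect a later snapshot).

```python
# Illustrative sketch only (not Altmetric's published method): convert the
# "rank of N outputs" figures above into an approximate percentile, i.e. the
# share of scored outputs that this one outranks.

def approx_percentile(rank: int, total: int) -> float:
    """Percentage of outputs ranked below the given rank."""
    return 100.0 * (total - rank) / total

# Rank/total pairs taken from the comparison table above.
comparisons = [
    ("All research outputs", 896_378, 25_080_267),
    ("Outputs from Implementation Science", 113, 1_796),
    ("Outputs of similar age", 18_394, 320_987),
    ("Outputs of similar age from Implementation Science", 5, 31),
]

for label, rank, total in comparisons:
    print(f"{label}: ~{approx_percentile(rank, total):.0f}th percentile")
```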