
Figuring out fidelity: a worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies

Overview of attention for article published in Implementation Science, February 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (88th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (64th percentile)

Mentioned by

  • 1 policy source
  • 21 X users

Citations

  • 50 citations (Dimensions)

Readers on

  • 155 Mendeley
  • 1 CiteULike
Published in
Implementation Science, February 2016
DOI 10.1186/s13012-016-0378-6
Authors

Abby Haynes, Sue Brennan, Sally Redman, Anna Williamson, Gisselle Gallego, Phyllis Butow, The CIPHER team

Abstract

In this paper, we identify and respond to the fidelity assessment challenges posed by novel contextualised interventions (i.e. interventions that are informed by composite social and psychological theories and which incorporate standardised and flexible components in order to maximise effectiveness in complex settings). We (a) describe the difficulties of, and propose a method for, identifying the essential elements of a contextualised intervention; (b) provide a worked example of an approach for critiquing the validity of putative essential elements; and (c) demonstrate how essential elements can be refined during a trial without compromising the fidelity assessment. We used an exploratory test-and-refine process, drawing on empirical evidence from the process evaluation of Supporting Policy In health with Research: an Intervention Trial (SPIRIT). Mixed methods data were triangulated to identify, critique and revise how the intervention's essential elements should be articulated and scored. Over 50 provisional elements were refined to a final list of 20 and the scoring rationalised. Six (often overlapping) challenges to the validity of the essential elements were identified: (1) redundant: the element was not essential; (2) poorly articulated: unclear, too specific or not specific enough; (3) infeasible: it was not possible to implement the essential element as intended; (4) ineffective: the element did not effectively deliver the change principles; (5) paradoxical: counteracting vital goals or change principles; or (6) absent or suboptimal: additional or more effective ways of operationalising the theory were identified. We also identified potentially valuable 'prohibited' elements that could be used to help reduce threats to validity. We devised a method for critiquing the construct validity of our intervention's essential elements and modifying how they were articulated and measured, while simultaneously using them as fidelity indicators.
This process could be used or adapted for other contextualised interventions, taking evaluators closer to making theoretically and contextually sensitive decisions upon which to base fidelity assessments.

X Demographics

The data shown below were collected from the profiles of 21 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 155 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 4 3%
Canada 1 <1%
Unknown 150 97%

Demographic breakdown

Readers by professional status Count As %
Researcher 33 21%
Student > Ph.D. Student 29 19%
Student > Doctoral Student 15 10%
Student > Master 12 8%
Student > Postgraduate 7 5%
Other 32 21%
Unknown 27 17%
Readers by discipline Count As %
Medicine and Dentistry 39 25%
Social Sciences 30 19%
Nursing and Health Professions 19 12%
Psychology 18 12%
Computer Science 3 2%
Other 11 7%
Unknown 35 23%
Attention Score in Context


This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 December 2017.
All research outputs: #2,275,923 of 25,765,370 outputs
Outputs from Implementation Science: #444 of 1,821 outputs
Outputs of similar age: #35,074 of 314,055 outputs
Outputs of similar age from Implementation Science: #17 of 48 outputs
Altmetric has tracked 25,765,370 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,821 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.9. This one has done well, scoring higher than 75% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 314,055 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 88% of its contemporaries.
We're also able to compare this research output to 48 others from the same source and published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 64% of its contemporaries.