
Electronic audit and feedback intervention with action implementation toolbox to improve pain management in intensive care: protocol for a laboratory experiment and cluster randomised trial

Overview of attention for article published in Implementation Science, May 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (82nd percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

16 X users

Citations

23 Dimensions

Readers on

108 Mendeley
Title
Electronic audit and feedback intervention with action implementation toolbox to improve pain management in intensive care: protocol for a laboratory experiment and cluster randomised trial
Published in
Implementation Science, May 2017
DOI 10.1186/s13012-017-0594-8
Pubmed ID
Authors

Wouter T. Gude, Marie-José Roos-Blom, Sabine N. van der Veer, Evert de Jonge, Niels Peek, Dave A. Dongelmans, Nicolette F. de Keizer

Abstract

Audit and feedback is often used as a strategy to improve quality of care; however, its effects are variable and often marginal. In order to learn how to design and deliver effective feedback, we need to understand its mechanisms of action. This theory-informed study will investigate how electronic audit and feedback affects improvement intentions (i.e. the information-intention gap), and whether an action implementation toolbox with suggested actions and materials helps translate those intentions into action (i.e. the intention-behaviour gap). The study will be executed in Dutch intensive care units (ICUs) and will focus on pain management.

We will conduct a laboratory experiment with individual ICU professionals to assess the impact of feedback on their intentions to improve practice. Next, we will conduct a cluster randomised controlled trial with ICUs allocated to receive feedback either without or with the action implementation toolbox. Participants will not be told explicitly which aspect of the intervention is randomised; they will only be aware that there are two variations of providing feedback. ICUs are eligible for participation if they submit indicator data to the Dutch National Intensive Care Evaluation (NICE) quality registry and agree to allocate a quality improvement team that spends 4 h per month on the intervention. All participating ICUs will receive access to an online quality dashboard that provides two functionalities: gaining insight into clinical performance on pain management indicators and developing action plans. ICUs with access to the toolbox can develop their action plans guided by a list of potential barriers in the care process, associated suggested actions, and supporting materials to facilitate implementation of the actions.

The primary outcome measure for the laboratory experiment is the proportion of improvement intentions set by participants that are consistent with recommendations based on peer comparisons; for the randomised trial it is the proportion of patient shifts during which pain has been adequately managed. We will also conduct a process evaluation to understand how the intervention is implemented and used in clinical practice, and how implementation and use affect the intervention's impact.

The results of this study will inform care providers and managers in ICUs and other clinical settings how to use indicator-based performance feedback in conjunction with an action implementation toolbox to improve quality of care. Within the ICU context, this study will produce concrete and directly applicable knowledge about what is and is not effective for improving pain management, and under which circumstances. The results will further guide future research that aims to understand the mechanisms behind audit and feedback and contribute to identifying the active ingredients of successful interventions.

Trial registration: ClinicalTrials.gov NCT02922101. Registered 26 September 2016.

X Demographics

The data shown below were collected from the profiles of 16 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 108 Mendeley readers of this research output.

Geographical breakdown

Country (Count, %)
Unknown: 108 (100%)

Demographic breakdown

Readers by professional status (Count, %)
Student > Master: 12 (11%)
Other: 9 (8%)
Researcher: 9 (8%)
Student > Bachelor: 8 (7%)
Lecturer: 5 (5%)
Other: 32 (30%)
Unknown: 33 (31%)

Readers by discipline (Count, %)
Medicine and Dentistry: 20 (19%)
Nursing and Health Professions: 16 (15%)
Psychology: 10 (9%)
Business, Management and Accounting: 5 (5%)
Unspecified: 4 (4%)
Other: 15 (14%)
Unknown: 38 (35%)
Attention Score in Context

This research output has an Altmetric Attention Score of 11. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 May 2018.
All research outputs: #3,000,455 of 23,650,645 outputs
Outputs from Implementation Science: #650 of 1,730 outputs
Outputs of similar age: #55,595 of 314,640 outputs
Outputs of similar age from Implementation Science: #21 of 35 outputs
Altmetric has tracked 23,650,645 research outputs across all sources so far. Compared to these, this one has done well and is in the 87th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,730 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.8. This one has received more attention than average, scoring higher than 62% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 314,640 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 82% of its contemporaries.
We're also able to compare this research output to 35 others from the same source and published within six weeks on either side of this one. This one is in the 42nd percentile – i.e., 42% of its contemporaries scored the same or lower than it.
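
For readers who want to check the arithmetic behind these percentiles, the short Python sketch below reproduces all four figures from the "#rank of total" pairs listed above. The formula used (the share of tracked outputs that scored the same as or lower than this one, counting the output itself) is an assumption made for illustration, not Altmetric's documented method, though it is consistent with every percentile reported on this page.

```python
# Sketch: reproduce the percentile figures on this page from the "#rank of total"
# pairs listed above. NOTE: the formula below is an assumption for illustration,
# not Altmetric's documented method; it happens to match all four reported values.

def percentile_rank(rank: int, total: int) -> int:
    """Percent of outputs that scored the same as or lower than the output
    at the given rank (rank 1 = highest score), counting the output itself."""
    return int((total - rank + 1) / total * 100)

contexts = {
    "All research outputs":                        (3_000_455, 23_650_645),  # -> 87
    "Outputs from Implementation Science":         (650, 1_730),             # -> 62
    "Outputs of similar age":                      (55_595, 314_640),        # -> 82
    "Outputs of similar age from the same source": (21, 35),                 # -> 42
}

for label, (rank, total) in contexts.items():
    print(f"{label}: percentile {percentile_rank(rank, total)}")
```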