
Redesigning printed educational materials for primary care physicians: design improvements increase usability

Overview of attention for article published in Implementation Science, November 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (77th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

11 X users

Citations

31 Dimensions

Readers on

79 Mendeley
Published in
Implementation Science, November 2015
DOI 10.1186/s13012-015-0339-5
Authors

Agnes Grudniewicz, Onil Bhattacharyya, K. Ann McKibbon, Sharon E. Straus

Abstract

Printed educational materials (PEMs) are a frequently used tool to disseminate clinical information and attempt to change behavior within primary care. However, their effect on clinician behavior is limited. In this study, we explored how PEMs can be redesigned to better meet the needs of primary care physicians (PCPs) and whether usability and selection can be increased when design principles and user preferences are applied.

We redesigned a publicly available PEM using physician preferences, design principles, and graphic designer support. We invited PCPs to select their preferred document between the redesigned and original versions in a discrete choice experiment, followed by an assessment of usability with the System Usability Scale and a think-aloud process. We conducted this study in both a controlled and an opportunistic setting to determine whether usability testing results vary by study location. Think-aloud data were thematically analyzed, and results were interpreted using the Technology Acceptance Model.

One hundred and eighty-four PCPs participated in the discrete choice experiment at the 2014 Family Medicine Forum, a large Canadian conference for family physicians. Of these, 87.7% preferred the redesigned version. Follow-up interviews were held with a randomly selected group of seven participants. We repeated this in a controlled setting in Toronto, Canada, with a set of 14 participants. Using the System Usability Scale, we found that usability scores increased significantly with the redesign (p < 0.001). We also found that when PCPs were given the choice between the two versions, they selected the redesigned version as their preferred PEM more often than the original (p < 0.001). Results did not appear to differ between the opportunistic and controlled settings. We used the results of the think-aloud process to add to a list of end-user preferences developed in a previous study.

We found that redesigning a PEM with user preferences and design principles can improve its usability and result in the PEM being selected more often than the original. We believe this finding supports the involvement of the user, the application of design principles, and the assistance of a graphic designer in the development of PEMs.

X Demographics

The data shown below were collected from the profiles of the 11 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 79 Mendeley readers of this research output.

Geographical breakdown

Country     Count   As %
Canada          1     1%
Unknown        78    99%

Demographic breakdown

Readers by professional status    Count   As %
Student > Ph.D. Student              13    16%
Student > Master                     10    13%
Researcher                            9    11%
Student > Bachelor                    6     8%
Professor                             5     6%
Other                                17    22%
Unknown                              19    24%
Readers by discipline             Count   As %
Medicine and Dentistry               11    14%
Nursing and Health Professions        9    11%
Social Sciences                       8    10%
Psychology                            7     9%
Computer Science                      4     5%
Other                                14    18%
Unknown                              26    33%
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 November 2015.
All research outputs: #4,727,893 of 23,392,375 outputs
Outputs from Implementation Science: #906 of 1,729 outputs
Outputs of similar age: #64,548 of 286,704 outputs
Outputs of similar age from Implementation Science: #23 of 35 outputs
Altmetric has tracked 23,392,375 research outputs across all sources so far. Compared to these, this one has done well and is in the 79th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,729 research outputs from this source. They typically receive much more attention than average, with a mean Attention Score of 14.8. This one is in the 47th percentile – i.e., 47% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 286,704 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 77% of its contemporaries.
We're also able to compare this research output to 35 others from the same source and published within six weeks on either side of this one. This one is in the 37th percentile – i.e., 37% of its contemporaries scored the same or lower than it.