Understanding Plain English summaries. A comparison of two approaches to improve the quality of Plain English summaries in research reports

Overview of attention for article published in Research Involvement and Engagement, October 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (93rd percentile)
  • Above-average Attention Score compared to outputs of the same age and source (60th percentile)

Mentioned by

61 X users
1 Facebook page
1 Wikipedia page
1 Redditor

Citations

22 Dimensions

Readers on

43 Mendeley
Title
Understanding Plain English summaries. A comparison of two approaches to improve the quality of Plain English summaries in research reports
Published in
Research Involvement and Engagement, October 2017
DOI 10.1186/s40900-017-0064-0
Pubmed ID
Authors

Emma Kirkpatrick, Wendy Gaisford, Elaine Williams, Elizabeth Brindley, Doreen Tembo, David Wright

Abstract

There is a need for the authors of research reports to be able to communicate their work clearly and effectively to readers who are not familiar with the research area. The National Institute for Health Research (NIHR), along with a number of other funding bodies and journals, requires researchers to write short lay summaries, often termed plain English summaries (PESs), to make research accessible to the general public. Because many researchers write using technical, specialised language, particularly in scientific reports, writing PESs can be challenging. In this study we looked at how to improve the quality of PESs. We took PESs which had been submitted to the NIHR Journals Library and asked authors to rewrite them using new guidance. We also asked an independent medical writer to edit the summaries. We measured the quality of these three versions (original summary, rewritten summary and edited summary) in two ways. First, we asked a group of people who were not specialists in the subject area to read and rate how easy the summaries were to understand. Secondly, we used a well-known measure called the Flesch reading ease score to assess how easy the PESs were to read. We found that there was no difference in how easy people found the summaries to understand across the three versions. However, the PESs that were rewritten by the authors and those that were edited by the independent medical writer were both easier to read than the originals. This shows that PESs can be improved and that, for organisations unable to employ an independent writer to edit summaries, providing clear, practical guidance to authors may be a cost-effective alternative.

Plain English summaries (PESs) or lay summaries are often included as part of research reports and journal articles. These summaries are vital to ensure that research findings are accessible and available to non-specialist audiences, for example patients and members of the public. Writing a PES requires the adoption of a different style from that generally used in a traditional scientific report, and researchers can find this challenging. This study explored two possible ways to improve the quality of PESs in the NIHR Journals Library: 1) providing enhanced guidance to authors and asking them to rewrite the PES, and 2) employing an independent medical writer to edit the PES. We compared the three versions of the PES (original, author-rewritten and independent-writer-edited) to assess 1) how easy they were to understand and 2) how easy they were to read. To establish how easy PESs were to understand, a group of 60 public reviewers read a set of summaries and rated them on a four-point scale from "Did not understand" to "Understood all". The Flesch reading ease score was used to measure how easy the summaries were to read. Results indicated no significant difference across the three versions of the PES in terms of ease of understanding. However, both the author-rewritten and independent-writer-edited versions were significantly easier to read than the original. There was no significant difference in ease of reading between these two versions. These findings suggest that employing independent medical writers to edit PESs and providing clear, practical guidance to authors are two ways in which the readability of PESs could be improved. The results have implications for journal editors and publishers seeking to enhance the accessibility and availability of research findings.
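For reference, the Flesch reading ease score used in the study is a standard readability formula: 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words), with higher scores indicating easier text. The sketch below shows roughly how such a score can be computed; the vowel-run syllable counter is a simplifying assumption, not the tool the authors used.

```python
import re

def count_syllables(word):
    # Rough heuristic: count runs of consecutive vowels; real readability
    # tools use dictionaries or more careful rules.
    word = word.lower()
    runs = re.findall(r"[aeiouy]+", word)
    count = len(runs)
    if word.endswith("e") and count > 1:
        count -= 1  # treat a trailing 'e' as silent
    return max(count, 1)

def flesch_reading_ease(text):
    # Flesch reading ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) \
                   - 84.6 * (syllables / len(words))

# Higher scores mean easier text; scores around 60-70 are commonly
# described as plain English.
print(round(flesch_reading_ease(
    "The cat sat on the mat. It was warm in the sun."), 1))
```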

X Demographics

The data shown below were collected from the profiles of 61 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 43 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 43 100%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 7 16%
Researcher 5 12%
Professor > Associate Professor 5 12%
Student > Bachelor 3 7%
Professor 2 5%
Other 8 19%
Unknown 13 30%
Readers by discipline Count As %
Social Sciences 6 14%
Medicine and Dentistry 6 14%
Nursing and Health Professions 5 12%
Linguistics 3 7%
Psychology 3 7%
Other 9 21%
Unknown 11 26%

Attention Score in Context

This research output has an Altmetric Attention Score of 37. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 08 March 2022.
All research outputs
#970,317
of 23,509,982 outputs
Outputs from Research Involvement and Engagement
#78
of 403 outputs
Outputs of similar age
#21,593
of 325,637 outputs
Outputs of similar age from Research Involvement and Engagement
#4
of 10 outputs
Altmetric has tracked 23,509,982 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 95th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 403 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 22.2. This one has done well, scoring higher than 80% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 325,637 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 93% of its contemporaries.
We're also able to compare this research output to 10 others from the same source and published within six weeks on either side of this one. This one has scored higher than 6 of them.
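As a rough arithmetic cross-check, the contextual percentiles quoted at the top of this page follow from the ranks listed above. A minimal sketch, assuming the percentile is simply the share of comparison outputs ranked below this one (this is an assumption about the arithmetic, not a description of Altmetric's internal method):

```python
def percentile_from_rank(rank, total):
    # rank 1 is the highest-scoring output in the comparison set
    return 100.0 * (total - rank) / total

print(percentile_from_rank(21_593, 325_637))  # ~93.4 -> "93rd percentile" (outputs of similar age)
print(percentile_from_rank(4, 10))            # 60.0  -> "60th percentile" (similar age and source)
```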