
Equivalence of electronic and paper administration of patient-reported outcome measures: a systematic review and meta-analysis of studies conducted between 2007 and 2013

Overview of attention for article published in Health and Quality of Life Outcomes, October 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (84th percentile)
  • High Attention Score compared to outputs of the same age and source (91st percentile)

Mentioned by

  • 16 X users

Citations

  • 192 Dimensions

Readers on

  • 207 Mendeley
Title
Equivalence of electronic and paper administration of patient-reported outcome measures: a systematic review and meta-analysis of studies conducted between 2007 and 2013
Published in
Health and Quality of Life Outcomes, October 2015
DOI 10.1186/s12955-015-0362-x
Pubmed ID
Authors

Willie Muehlhausen, Helen Doll, Nuz Quadri, Bethany Fordham, Paul O’Donohoe, Nijda Dogar, Diane J. Wild

Abstract

To conduct a systematic review and meta-analysis of the equivalence between electronic and paper administration of patient-reported outcome measures (PROMs) in studies conducted subsequent to those included in Gwaltney et al.'s 2008 review. A systematic literature review of PROM equivalence studies conducted between 2007 and 2013 identified 1,997 records, from which 72 studies met pre-defined inclusion/exclusion criteria. PRO data from each study were extracted, in terms of both correlation coefficients (ICCs, Spearman and Pearson correlations, Kappa statistics) and mean differences (standardized by the standard deviation, SD, and the response scale range). Pooled estimates of correlation and mean difference were estimated. The modifying effects of mode of administration, year of publication, study design, time interval between administrations, mean age of participants and publication type were examined. Four hundred thirty-five individual correlations were extracted; these correlations were highly variable (I² = 93.8) but showed generally good equivalence, with ICCs ranging from 0.65 to 0.99 and the pooled correlation coefficient being 0.88 (95 % CI 0.87 to 0.88). Standardised mean differences for 307 studies were small and less variable (I² = 33.5), with a pooled standardised mean difference of 0.037 (95 % CI 0.031 to 0.042). Average administration mode/platform-specific correlations from 56 studies (61 estimates) had a pooled estimate of 0.88 (95 % CI 0.86 to 0.90) and were still highly variable (I² = 92.1). Similarly, average platform-specific ICCs from 39 studies (42 estimates) had a pooled estimate of 0.90 (95 % CI 0.88 to 0.92) with an I² of 91.5. After excluding 20 studies with outlying correlation coefficients (≥3 SD from the mean), the I² was 54.4, with equivalence still high, the overall pooled correlation coefficient being 0.88 (95 % CI 0.87 to 0.88). Agreement was found to be greater in more recent studies (p < 0.001), in randomised studies compared with non-randomised studies (p < 0.001), in studies with a shorter interval between administrations (<1 day) (p < 0.001), and in respondents of mean age 28 to 55 compared with those either younger or older (p < 0.001). In terms of mode/platform, paper vs Interactive Voice Response System (IVRS) comparisons had the lowest pooled agreement and paper vs tablet/touch screen the highest (p < 0.001). The present study supports the conclusion of Gwaltney's previous meta-analysis showing that PROMs administered on paper are quantitatively comparable with measures administered on an electronic device. It also confirms the ISPOR Taskforce's conclusion that quantitative equivalence studies are not required for migrations with minor changes only. This finding should be reassuring to investigators, regulators and sponsors using questionnaires on electronic devices after migration using best practices. Although there are data indicating that migrations with moderate changes produce equivalent instrument versions, and hence do not require quantitative equivalence studies, additional work is necessary to establish this. Furthermore, there is a need to standardize migration practices and reporting practices (i.e. to include copies of tested instrument versions and screenshots) so that clear recommendations regarding equivalence testing can be made in the future.
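Pooled correlations and I² heterogeneity statistics like those quoted in the abstract are typically produced by a random-effects meta-analysis of transformed correlations. The authors do not publish code, so the sketch below is only a minimal illustration of one common approach (Fisher-z transformed correlations with DerSimonian-Laird between-study variance); the study values in the example are invented, and the review's actual models and software may differ in detail.

```python
# Illustrative sketch only: random-effects pooling of correlation coefficients
# via Fisher's z transform with DerSimonian-Laird heterogeneity (tau^2) and I^2.
# The correlations and sample sizes below are invented examples, not data from
# the review; the authors' actual analysis may differ.
import math

def pool_correlations(correlations, sample_sizes):
    # Fisher z-transform each correlation; the variance of z is approx. 1/(n - 3)
    z = [math.atanh(r) for r in correlations]
    v = [1.0 / (n - 3) for n in sample_sizes]
    w = [1.0 / vi for vi in v]                      # fixed-effect (inverse-variance) weights

    # Fixed-effect pooled estimate and Cochran's Q
    z_fe = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
    q = sum(wi * (zi - z_fe) ** 2 for wi, zi in zip(w, z))
    df = len(z) - 1

    # I^2: percentage of variability attributable to between-study heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

    # DerSimonian-Laird estimate of between-study variance tau^2
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0

    # Random-effects weights, pooled z with a 95% CI, back-transformed to r
    w_re = [1.0 / (vi + tau2) for vi in v]
    z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se     # 1.96 = normal quantile for 95% CI
    return math.tanh(z_re), (math.tanh(lo), math.tanh(hi)), i2

# Invented example: three paper-vs-electronic equivalence studies
r_pooled, ci, i2 = pool_correlations([0.85, 0.91, 0.78], [60, 120, 45])
print(f"pooled r = {r_pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, I^2 = {i2:.1f}%")
```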

X Demographics

The data shown below were collected from the profiles of 16 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 207 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Japan | 1 | <1%
Unknown | 206 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Researcher | 34 | 16%
Student > Master | 30 | 14%
Student > Ph. D. Student | 26 | 13%
Student > Doctoral Student | 16 | 8%
Other | 11 | 5%
Other | 37 | 18%
Unknown | 53 | 26%
Readers by discipline | Count | As %
Medicine and Dentistry | 63 | 30%
Psychology | 21 | 10%
Nursing and Health Professions | 15 | 7%
Social Sciences | 11 | 5%
Computer Science | 6 | 3%
Other | 20 | 10%
Unknown | 71 | 34%
Attention Score in Context

This research output has an Altmetric Attention Score of 11. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 09 August 2019.
All research outputs: #3,343,929 of 25,600,774 outputs
Outputs from Health and Quality of Life Outcomes: #274 of 2,301 outputs
Outputs of similar age: #43,670 of 290,224 outputs
Outputs of similar age from Health and Quality of Life Outcomes: #5 of 45 outputs
Altmetric has tracked 25,600,774 research outputs across all sources so far. Compared to these, this one has done well and is in the 86th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,301 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.5. This one has done well, scoring higher than 88% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 290,224 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 84% of its contemporaries.
We're also able to compare this research output to 45 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 91% of its contemporaries.
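As a rough arithmetic check, a rank of #N out of M tracked outputs corresponds to roughly the 100 × (1 − N/M)th percentile. The snippet below applies this approximation to the ranking figures listed above; it is not Altmetric's published formula, and the percentiles quoted on this page were calculated at the last-mention date with Altmetric's own rounding and tie handling, so small discrepancies are expected.

```python
# Rough approximation of a percentile from a rank and a total count.
# Not Altmetric's published formula; quoted percentiles may differ slightly
# because of rounding, tie handling, and when the scores were snapshotted.
def approx_percentile(rank, total):
    return 100.0 * (1.0 - rank / total)

# Figures taken from the ranking list above
print(f"All outputs:          ~{approx_percentile(3_343_929, 25_600_774):.0f}th percentile")
print(f"Same journal:         ~{approx_percentile(274, 2_301):.0f}th percentile")
print(f"Similar age:          ~{approx_percentile(43_670, 290_224):.0f}th percentile")
print(f"Similar age + source: ~{approx_percentile(5, 45):.0f}th percentile")
```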