
Choice of outcomes and measurement instruments in randomised trials on eLearning in medical education: a systematic mapping review protocol

Overview of attention for an article published in Systematic Reviews, May 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (75th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • Blogs: 1 blog
  • X (Twitter): 1 X user

Citations

  • Dimensions: 16 citations

Readers on

  • Mendeley: 111 readers
Title: Choice of outcomes and measurement instruments in randomised trials on eLearning in medical education: a systematic mapping review protocol
Published in: Systematic Reviews, May 2018
DOI: 10.1186/s13643-018-0739-0
Authors

Gloria C. Law, Christian Apfelbacher, Pawel P. Posadzki, Sandra Kemp, Lorainne Tudor Car

Abstract

The world is projected to face a shortage of 18 million healthcare workers by 2030. Increasing the number of well-trained healthcare workers through innovative approaches such as eLearning is widely recommended as a way to address this shortage. However, the high heterogeneity of learning outcomes in eLearning systematic reviews reveals a lack of consistency and agreement on core learning outcomes for eLearning in medical education. In addition, there appears to be a lack of validity evidence for the measurement instruments used in these trials. This undermines the credibility of the outcome measures and limits the ability to draw accurate and meaningful conclusions. The aim of this research is to address this issue by determining the choice of outcomes, the measurement instruments used and the prevalence of measurement instruments with validity evidence in randomised trials on eLearning for pre-registration medical education. We will conduct a systematic mapping review to identify the types of outcomes, the kinds of measurement instruments and the prevalence of validity evidence among measurement instruments in eLearning randomised controlled trials (RCTs) in pre-registration medical education. The search period will cover January 1990 to August 2017. We will consider studies on eLearning for health professionals' education. Two reviewers will independently extract and manage data from the included studies. Data will be analysed and synthesised according to the aim of the review. An appropriate choice of outcomes and measurement tools is essential for ensuring high-quality research in the field of eLearning and eHealth. The results of this study could have positive implications for other eHealth interventions, including (1) improving the quality and credibility of eLearning research, (2) enhancing the quality of digital medical education and (3) informing researchers, academics and curriculum developers about the types of outcomes and the validity evidence for measurement instruments used in eLearning studies. This protocol aims to support the advancement of the eLearning research field and the development of high-quality digital education for healthcare professionals. PROSPERO registration: CRD42017068427.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 111 Mendeley readers of this research output.

Geographical breakdown

  • Unknown country: 111 readers (100%)

Demographic breakdown

Readers by professional status:
  • Student > Master: 11 (10%)
  • Researcher: 10 (9%)
  • Lecturer: 9 (8%)
  • Student > Bachelor: 9 (8%)
  • Professor > Associate Professor: 8 (7%)
  • Other: 29 (26%)
  • Unknown: 35 (32%)

Readers by discipline:
  • Medicine and Dentistry: 23 (21%)
  • Computer Science: 7 (6%)
  • Business, Management and Accounting: 5 (5%)
  • Social Sciences: 5 (5%)
  • Engineering: 4 (4%)
  • Other: 26 (23%)
  • Unknown: 41 (37%)
Attention Score in Context

This research output has an Altmetric Attention Score of 8. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 20 May 2018.
  • All research outputs: #4,038,885 of 23,058,939 outputs
  • Outputs from Systematic Reviews: #805 of 2,006 outputs
  • Outputs of similar age: #78,752 of 328,271 outputs
  • Outputs of similar age from Systematic Reviews: #26 of 40 outputs

Altmetric has tracked 23,058,939 research outputs across all sources so far. Compared to these, this one has done well and is in the 82nd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,006 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.8. This one has received more attention than average, scoring higher than 58% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 328,271 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 75% of its contemporaries.
We're also able to compare this research output to 40 others from the same source and published within six weeks on either side of this one. This one is in the 30th percentile – i.e., 30% of its contemporaries scored the same or lower than it.
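
The percentile figures above follow from each ranking position and the size of its comparison set. Below is a minimal sketch, not Altmetric's actual method, that approximates a percentile rank as (total - rank) / total; Altmetric's exact tie handling and rounding differ, so these approximations land within a few points of the quoted percentiles.

```python
# Minimal sketch (assumed formula, not Altmetric's implementation):
# approximate the percentile rank of an output from its ranking position
# and the total number of outputs in the comparison set.

def percentile_rank(rank: int, total: int) -> float:
    """Approximate share of outputs in the set that this one outscores (rank 1 = best)."""
    return (total - rank) / total * 100

# Ranking positions and set sizes quoted in the section above.
comparisons = {
    "All research outputs": (4_038_885, 23_058_939),
    "Outputs from Systematic Reviews": (805, 2_006),
    "Outputs of similar age": (78_752, 328_271),
    "Outputs of similar age from Systematic Reviews": (26, 40),
}

for label, (rank, total) in comparisons.items():
    print(f"{label}: #{rank:,} of {total:,} -> outscores ~{percentile_rank(rank, total):.0f}% of the set")
```

Running this reproduces the all-outputs and similar-age comparisons closely (about 82% and 76%), while the source-specific comparisons differ by a few points from the quoted 58% and 30th percentile, presumably because of how ties and rounding are handled.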