
Uncovering students’ misconceptions by assessment of their written questions

Overview of attention for article published in BMC Medical Education, August 2016

Mentioned by: 1 X user
Citations: 24 (Dimensions)
Readers: 125 (Mendeley)
Title: Uncovering students’ misconceptions by assessment of their written questions
Published in: BMC Medical Education, August 2016
DOI: 10.1186/s12909-016-0739-5
Authors: Marleen Olde Bekkink, A. R. T. Rogier Donders, Jan G. Kooloos, Rob M. W. de Waal, Dirk J. Ruiter

Abstract

Misconceptions are ideas that are inconsistent with current scientific views. They are difficult to detect and refractory to change. Misconceptions can negatively influence how new concepts in science are learned, yet they are rarely measured in biomedical courses. Early identification of misconceptions is of critical relevance for effective teaching, but it is a difficult task for teachers, who tend to either over- or underestimate students' prior knowledge. A systematic appreciation of existing misconceptions is therefore desirable. This explorative study was performed to determine whether written questions generated by students can be used to uncover their misconceptions.

During a small-group work (SGW) session on Tumour Pathology in a (bio)medical bachelor course on General Pathology, students were asked to write down a question about the topic. The question had to probe deeper into disease mechanisms rather than mere factual knowledge. Three independent expert pathologists determined whether the content of each question was compatible with a misconception; consensus was reached in all cases. The study outcomes were to determine whether misconceptions can be identified in students' written questions; if so, to measure how frequently they occur; and to determine whether the presence of such misconceptions is negatively associated with the students' formal course examination score. A subgroup analysis was performed by gender and discipline.

A total of 242 students participated in the SGW sessions, of whom 221 (91 %) formulated a question. Thirty-six questions did not meet the inclusion criteria. Of the 185 questions rated, 11 % (n = 20) were compatible with a misconception. Misconceptions were found only in medical students' questions, not in biomedical science students' questions. The formal examination score on Tumour Pathology was 5.0 (SD 2.0) in the group with misconceptions and 6.7 (SD 2.4) in the group without misconceptions (p = 0.003).

This study demonstrates that misconceptions can be uncovered in students' written questions. The occurrence of these misconceptions was negatively associated with the formal examination score. Identifying misconceptions creates an opportunity to repair them during the remaining course sessions, in advance of the formal examination.
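The reported score difference can be roughly checked from the summary statistics alone. The sketch below assumes a two-sample (Welch's) t-test and group sizes of n = 20 (questions compatible with a misconception) and n = 165 (the remaining rated questions); the abstract does not state which test the authors actually used, so this is illustrative only.

```python
# Illustrative re-check of the reported examination-score difference.
# Assumptions (not stated in the abstract): a Welch two-sample t-test,
# n = 20 students in the misconception group, n = 165 in the other group.
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=5.0, std1=2.0, nobs1=20,    # group with misconceptions
    mean2=6.7, std2=2.4, nobs2=165,   # group without misconceptions
    equal_var=False,                  # Welch's t-test (unequal variances)
)
print(f"t = {t:.2f}, p = {p:.4f}")
# Compare with the reported p = 0.003; exact agreement depends on the
# test and group sizes the authors actually used.
```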

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 125 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States        1    <1%
Unknown            124    99%

Demographic breakdown

Readers by professional status     Count   As %
Student > Master                      20    16%
Researcher                            14    11%
Student > Bachelor                    11     9%
Lecturer                               9     7%
Professor > Associate Professor        6     5%
Other                                 26    21%
Unknown                               39    31%

Readers by discipline                          Count   As %
Medicine and Dentistry                            14    11%
Social Sciences                                   14    11%
Biochemistry, Genetics and Molecular Biology       7     6%
Agricultural and Biological Sciences               6     5%
Physics and Astronomy                              6     5%
Other                                             35    28%
Unknown                                           43    34%
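The "As %" columns above are simply each count expressed as a share of the 125 Mendeley readers. A small illustrative check, using the counts copied from the professional-status table:

```python
# Re-derive the "As %" column of the professional-status table from its counts.
# Counts and the total of 125 readers are taken from the page above.
counts = {
    "Student > Master": 20,
    "Researcher": 14,
    "Student > Bachelor": 11,
    "Lecturer": 9,
    "Professor > Associate Professor": 6,
    "Other": 26,
    "Unknown": 39,
}
total = 125
for status, n in counts.items():
    print(f"{status}: {n} ({round(100 * n / total)}%)")
```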
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 August 2016.
All research outputs: #21,264,673 of 23,881,329 outputs
Outputs from BMC Medical Education: #3,407 of 3,576 outputs
Outputs of similar age: #304,042 of 344,917 outputs
Outputs of similar age from BMC Medical Education: #79 of 83 outputs
Altmetric has tracked 23,881,329 research outputs across all sources so far. This one is in the 1st percentile – i.e., 1% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,576 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one is in the 1st percentile – i.e., 1% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 344,917 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 83 others from the same source and published within six weeks on either side of this one. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
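As a rough illustration of how a percentile-in-context figure like the ones above can be computed, the sketch below compares a score against the scores of its contemporaries. This is a generic percentile-rank calculation, not Altmetric's actual implementation, and the peer scores shown are made up for demonstration.

```python
# Generic percentile-rank sketch: share of peer outputs whose score is
# the same as or lower than the given score. Illustrative only; the peer
# scores below are hypothetical.
def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percentage of peers scoring the same as or lower than `score`."""
    if not peer_scores:
        return 0.0
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# Hypothetical sample of contemporaries' Attention Scores.
peers = [0, 0, 1, 1, 2, 3, 5, 6, 8, 12, 25, 40]
print(f"Score 1 sits at the {percentile_rank(1, peers):.0f}th percentile of this sample")
```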