
Toward a better judgment of item relevance in progress testing

Overview of attention for article published in BMC Medical Education, September 2017

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • X: 2 users

Citations

  • Dimensions: 4

Readers

  • Mendeley: 32
Title: Toward a better judgment of item relevance in progress testing
Published in: BMC Medical Education, September 2017
DOI: 10.1186/s12909-017-0989-x
Pubmed ID:
Authors: Xandra M. C. Janssen-Brandt, Arno M. M. Muijtjens, Dominique M. A. Sluijsmans

Abstract

Items must be relevant to ensure item quality and test validity. Since "item relevance" has not been operationalized yet, we developed a rubric to define it. This study explores the influence of this rubric on the assessment of item relevance and on inter-rater agreement. Members of the item review committee (RC) and students, teachers, and alumni (STA) reassessed the relevance of 50 previously used progress test (PT) items and decided about their inclusion using a 5-criteria rubric. Data were analyzed at item level using paired samples t-tests, Intraclass Correlation Coefficients (ICC), and linear regression analysis, and at rater level in a generalizability analysis per group. The proportion of items that the RC judged relevant enough to be included decreased substantially from 1.00 to 0.72 (p < 0.001). Agreement between the RC and STA was high, with an ICC of >0.7 across items. The relation between inclusion and relevance was strong (correlation = 0.89, p < 0.001), and did not differ between RC and STA. To achieve an acceptable inter-rater reliability for relevance and inclusion, 6 members must serve on the RC. Use of the rubric results in a stricter evaluation of items' appropriateness for inclusion in the PT and facilitates agreement between the RC and other stakeholders. Hence, it may help increase the acceptability and validity of the PT.
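The abstract names three item-level analyses: paired samples t-tests, intraclass correlation, and a correlation/regression of inclusion on relevance. The sketch below shows how such analyses could be run on per-item mean scores; the data are simulated placeholders, the `icc_a1` helper is hypothetical, and it implements the standard ICC(2,1) formula (two-way random effects, absolute agreement, single measures), which may differ from the exact ICC variant the authors used.

```python
import numpy as np
from scipy import stats

def icc_a1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    `ratings` has shape (n_items, k_raters); here k_raters = 2 (the per-item
    means of the RC and STA groups). Hypothetical helper, not the paper's code.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)                           # per-item means
    col_means = ratings.mean(axis=0)                           # per-group means
    ms_r = k * np.sum((row_means - grand) ** 2) / (n - 1)      # between items
    ms_c = n * np.sum((col_means - grand) ** 2) / (k - 1)      # between rater groups
    ss_e = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    ms_e = ss_e / ((n - 1) * (k - 1))                          # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Simulated per-item relevance means for 50 progress-test items
# (placeholders, not the study's data).
rng = np.random.default_rng(0)
rc  = rng.uniform(1.0, 5.0, 50)               # review committee (RC)
sta = rc + rng.normal(0.0, 0.4, 50)           # students, teachers, alumni (STA)

t, p_t = stats.ttest_rel(rc, sta)             # paired samples t-test at item level
icc    = icc_a1(np.column_stack([rc, sta]))   # RC-STA agreement across items
r, p_r = stats.pearsonr(rc, sta)              # correlation, cf. relevance vs. inclusion
print(f"t = {t:.2f} (p = {p_t:.3f}), ICC = {icc:.2f}, r = {r:.2f}")
```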

X Demographics

The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 32 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown       32    100%

Demographic breakdown

Readers by professional status                  Count    As %
Student > Master                                    5     16%
Student > Ph.D. Student                             3      9%
Librarian                                           2      6%
Student > Doctoral Student                          2      6%
Student > Bachelor                                  2      6%
Other                                               9     28%
Unknown                                             9     28%

Readers by discipline                           Count    As %
Medicine and Dentistry                              9     28%
Nursing and Health Professions                      7     22%
Social Sciences                                     2      6%
Earth and Planetary Sciences                        1      3%
Biochemistry, Genetics and Molecular Biology        1      3%
Other                                               2      6%
Unknown                                            10     31%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 September 2017.
Comparison group                                       Rank           Out of
All research outputs                                   #14,363,636    23,001,641 outputs
Outputs from BMC Medical Education                     #1,979         3,363 outputs
Outputs of similar age                                 #175,496       315,613 outputs
Outputs of similar age from BMC Medical Education      #38            59 outputs
Altmetric has tracked 23,001,641 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,363 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.3. This one is in the 36th percentile – i.e., 36% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 315,613 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 41st percentile – i.e., 41% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 59 others from the same source and published within six weeks on either side of this one. This one is in the 32nd percentile – i.e., 32% of its contemporaries scored the same or lower than it.
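Each percentile figure above follows the same rule: the share of outputs in the relevant comparison pool that scored the same as or lower than this one. A minimal sketch of that calculation, with a simulated pool of scores standing in for Altmetric's actual data:

```python
import numpy as np

def percentile_rank(score, pool):
    """Percentage of outputs in `pool` scoring the same as or lower than `score`."""
    pool = np.asarray(pool, dtype=float)
    return 100.0 * np.mean(pool <= score)

# Simulated Attention Scores for the 315,613 similar-age outputs
# (a stand-in distribution, not Altmetric's real data).
pool = np.random.default_rng(1).lognormal(mean=0.5, sigma=1.2, size=315_613)
print(f"A score of 2 sits at the {percentile_rank(2, pool):.0f}th percentile of this pool")
```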