
Development, administration, and validity evidence of a subspecialty preparatory test toward licensure: a pilot study

Overview of attention for article published in BMC Medical Education, August 2018

  • Average Attention Score compared to outputs of the same age

Mentioned by

  • 3 X users
  • 1 Facebook page

Citations

  • 1 (Dimensions)

Readers

  • 22 (Mendeley)
Published in
BMC Medical Education, August 2018
DOI 10.1186/s12909-018-1294-z
Authors

John Johnson, Alan Schwartz, Matthew Lineberry, Faisal Rehman, Yoon Soo Park

Abstract

Trainees in medical subspecialties lack validated assessment scores that can be used to prepare for their licensing examination. This paper presents the development, administration, and validity evidence of a constructed-response preparatory test (CRPT) administered to meet the needs of nephrology trainees. Learning objectives from the licensing examination were used to develop a test blueprint for the preparatory test. Messick's unified validity framework was used to gather validity evidence for content, response process, internal structure, relations to other variables, and consequences. Questionnaires were used to gather data on the trainees' perceptions of examination preparedness, item clarity, and curriculum adequacy. Ten trainees and five faculty volunteers took the test. The majority of trainees passed the CRPT; however, many scored poorly on items assessing knowledge of renal pathology and physiology. We gathered the following five sources of validity evidence:

(1) Content: the CRPT mapped to the licensing examination blueprint, with items demonstrating clarity and a range of difficulty.
(2) Response process: moderate rater agreement (intraclass correlation = .58).
(3) Internal structure: sufficient reliability based on generalizability theory (G-coefficient = .76, Φ-coefficient = .53).
(4) Relations to other variables: CRPT scores reflected years of exposure to nephrology and clinical practice.
(5) Consequences: a post-assessment survey revealed that none of the test takers felt "poorly prepared" for the upcoming summative examination and that their studying would increase in duration and be adapted in content focus.

Preparatory tests using constructed-response items mapped to a licensure examination blueprint can be developed and used in local program settings to help prepare learners for subspecialty licensure examinations. The CRPT and questionnaire data identified shortcomings of the nephrology training program curriculum. Following the preparatory test, trainees reported an improved sense of preparedness for their licensing examination.
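The abstract reports a G-coefficient of .76 (relative reliability) and a Φ-coefficient of .53 (absolute dependability). For readers unfamiliar with how these two generalizability-theory indices differ, the sketch below shows the standard formulas for a persons-crossed-with-raters design; the variance components used here are illustrative placeholders, not estimates from the paper:

```python
def g_coefficient(var_p, var_pr_e, n_r):
    """Relative (norm-referenced) G-coefficient for a persons x raters
    design: person variance over person variance plus the averaged
    person-by-rater error variance."""
    return var_p / (var_p + var_pr_e / n_r)

def phi_coefficient(var_p, var_r, var_pr_e, n_r):
    """Absolute (criterion-referenced) Phi-coefficient: the rater
    main-effect variance also counts as error, so Phi <= G."""
    return var_p / (var_p + (var_r + var_pr_e) / n_r)

# Illustrative variance components (NOT the paper's estimates):
var_p, var_r, var_pr_e, n_r = 4.0, 1.5, 2.0, 2

print(round(g_coefficient(var_p, var_pr_e, n_r), 3))           # 0.8
print(round(phi_coefficient(var_p, var_r, var_pr_e, n_r), 3))  # 0.696
```

Because Φ treats systematic rater leniency or severity as error while G does not, Φ is always at or below G, which is consistent with the paper's pattern (.53 vs .76).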

X Demographics

The data shown below were collected from the profiles of the 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 22 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    22       100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph.D. Student           2        9%
Student > Master                  2        9%
Student > Doctoral Student        2        9%
Lecturer                          2        9%
Other                             1        5%
Other                             5        23%
Unknown                           8        36%

Readers by discipline                           Count    As %
Medicine and Dentistry                          4        18%
Computer Science                                3        14%
Nursing and Health Professions                  2        9%
Social Sciences                                 2        9%
Biochemistry, Genetics and Molecular Biology    1        5%
Other                                           2        9%
Unknown                                         8        36%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 August 2018.
  • All research outputs: #15,015,838 of 23,098,660
  • Outputs from BMC Medical Education: #2,183 of 3,387
  • Outputs of similar age: #198,686 of 331,041
  • Outputs of similar age from BMC Medical Education: #50 of 71
Altmetric has tracked 23,098,660 research outputs across all sources so far. This one is in the 32nd percentile – i.e., 32% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,387 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one is in the 32nd percentile – i.e., 32% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 331,041 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 36th percentile – i.e., 36% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 71 others from the same source and published within six weeks on either side of this one. This one is in the 23rd percentile – i.e., 23% of its contemporaries scored the same or lower than it.
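Each percentile statement above uses the same definition: the percentage of peers that scored the same or lower. A minimal sketch of that percentile-rank calculation, using made-up peer scores for illustration (this page does not show Altmetric's actual computation pipeline):

```python
def percentile_rank(score, peer_scores):
    """Percentage of peers whose score is the same or lower than
    the given score, matching the definition quoted in the text."""
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100 * at_or_below / len(peer_scores)

# Illustrative peer Attention Scores (not real Altmetric data):
peers = [0, 1, 1, 2, 3, 5, 8, 13, 21, 40]
print(percentile_rank(2, peers))  # 40.0
```

Note that under this "same or lower" convention a tied score counts toward the percentile, which is why an output with a modest score of 2 can still sit in the 30th-plus percentile when many peers receive little or no attention.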