
The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis

Overview of attention for article published in BMC Medical Education, August 2016
Mentioned by

X (Twitter): 2 users

Citations

Dimensions: 17

Readers on

Mendeley: 106
Title
The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis
Published in
BMC Medical Education, August 2016
DOI 10.1186/s12909-016-0718-x
Pubmed ID
Authors

Belinda K. Judd, Justin N. Scanlan, Jennifer A. Alison, Donna Waters, Christopher J. Gordon

Abstract

Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure the professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected: 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1- and 2-week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items were marked 'not assessed' by clinical educators in more than 25% of scores, which limited the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.
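
The Rasch analysis named in the abstract places student ability and item difficulty on a common logit scale, and an item fits the model when the chance of a favourable rating depends only on the gap between the two. The sketch below is a rough illustration only: the APP items are rated on an ordinal scale, so the published analysis would rely on a polytomous Rasch model, and the dichotomous form plus all numbers here are invented for demonstration, not data from the study.

import numpy as np

def rasch_prob(theta, b):
    # Dichotomous Rasch model: probability of a "success" on an item of
    # difficulty b for a person of ability theta (both in logits).
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Illustrative values only: three student abilities and four item difficulties.
abilities = np.array([-1.0, 0.0, 1.5])
difficulties = np.array([-0.5, 0.0, 0.8, 1.2])

# Expected probability matrix (students x items); under a unidimensional model
# every item orders the students the same way by ability.
expected = rasch_prob(abilities[:, None], difficulties[None, :])
print(np.round(expected, 2))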

X Demographics

The data shown below were collected from the profiles of the 2 X (Twitter) users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 106 Mendeley readers of this research output.

Geographical breakdown

Country     Count   As %
Unknown     106     100%

Demographic breakdown

Readers by professional status    Count   As %
Student > Master                  13      12%
Student > Bachelor                12      11%
Professor > Associate Professor   7       7%
Professor                         7       7%
Lecturer                          6       6%
Other                             27      25%
Unknown                           34      32%

Readers by discipline             Count   As %
Nursing and Health Professions    37      35%
Medicine and Dentistry            16      15%
Psychology                        2       2%
Sports and Recreations            2       2%
Social Sciences                   2       2%
Other                             10      9%
Unknown                           37      35%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 09 August 2016.
All research outputs: #14,268,952 of 22,882,389
Outputs from BMC Medical Education: #1,967 of 3,338
Outputs of similar age: #216,181 of 366,897
Outputs of similar age from BMC Medical Education: #45 of 69
Altmetric has tracked 22,882,389 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,338 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.3. This one is in the 36th percentile – i.e., 36% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 366,897 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 38th percentile – i.e., 38% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 69 others from the same source and published within six weeks on either side of this one. This one is in the 30th percentile – i.e., 30% of its contemporaries scored the same or lower than it.
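
The percentile statements above follow the definition used in the text: the share of comparison outputs whose Attention Score is the same as or lower than this one. A minimal sketch of that calculation, using an invented set of peer scores rather than Altmetric data:

import numpy as np

def percentile_rank(score, comparison_scores):
    # Share (as a percentage) of comparison outputs scoring the same or lower.
    comparison_scores = np.asarray(comparison_scores)
    return 100.0 * np.mean(comparison_scores <= score)

# Hypothetical Attention Scores for a small comparison set.
peer_scores = [0, 0, 1, 1, 2, 3, 5, 8, 13, 40]
print(f"{percentile_rank(2, peer_scores):.0f}th percentile")  # 50th percentile for this toy set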