
Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program

Overview of attention for an article published in Health Research Policy and Systems, March 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (79th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (53rd percentile)

Mentioned by
11 X users

Citations
17 citations on Dimensions

Readers on Mendeley
77 readers
Title: Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program
Published in: Health Research Policy and Systems, March 2016
DOI: 10.1186/s12961-016-0092-5
Authors: Rebecca L. Mador, Kathy Kornas, Anne Simard, Vinita Haroun

Abstract

Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaptation and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010), was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys of priority setting participants, document review, and consultation with the program advisory committee. The evaluation helped identify improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified the timing of priority setting activities and the level of control over the process as key factors that influenced the ability to implement changes effectively. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

X Demographics

The data shown below were collected from the profiles of 11 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 77 Mendeley readers of this research output.

Geographical breakdown

Country      Count   As %
Indonesia        1     1%
Malaysia         1     1%
Unknown         75    97%

Demographic breakdown

Readers by professional status    Count   As %
Researcher                           12    16%
Student > Ph.D. Student              10    13%
Student > Master                      7     9%
Student > Doctoral Student            5     6%
Student > Postgraduate                5     6%
Other                                14    18%
Unknown                              24    31%

Readers by discipline                  Count   As %
Medicine and Dentistry                    16    21%
Social Sciences                           10    13%
Nursing and Health Professions             8    10%
Business, Management and Accounting        4     5%
Psychology                                 4     5%
Other                                      9    12%
Unknown                                   26    34%
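
The "As %" figures above appear to be each count expressed as a share of all 77 Mendeley readers, rounded to the nearest whole percent. A minimal sketch of that arithmetic follows; the counts are taken from the table above, and the rounding rule is an assumption that happens to reproduce the listed values:

```python
# Sketch: reproduce the "As %" column, assuming each value is the reader
# count as a share of the 77 Mendeley readers, rounded to a whole percent.
TOTAL_READERS = 77

status_counts = {
    "Researcher": 12,
    "Student > Ph.D. Student": 10,
    "Student > Master": 7,
    "Student > Doctoral Student": 5,
    "Student > Postgraduate": 5,
    "Other": 14,
    "Unknown": 24,
}

for status, count in status_counts.items():
    share = round(count / TOTAL_READERS * 100)  # e.g. 12/77 -> 16%
    print(f"{status:<30} {count:>3}  {share:>3}%")
```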
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 September 2018.
All research outputs: #3,930,873 of 23,577,761 outputs
Outputs from Health Research Policy and Systems: #553 of 1,238 outputs
Outputs of similar age: #61,220 of 302,259 outputs
Outputs of similar age from Health Research Policy and Systems: #11 of 26 outputs
Altmetric has tracked 23,577,761 research outputs across all sources so far. Compared to these, this one has done well and is in the 83rd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,238 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.1. This one has received more attention than average, scoring higher than 55% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 302,259 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 79% of its contemporaries.
We're also able to compare this research output to 26 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 53% of its contemporaries.
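
Each ranking above pairs a rank with a pool size, and the percentiles quoted in this section follow roughly from that ratio. Below is a minimal sketch of the conversion, assuming a simple floor((1 - rank/total) * 100) formula; it reproduces three of the four quoted percentiles exactly, while the smallest pool (#11 of 26) comes out a few points above the quoted 53%, presumably because of how Altmetric handles ties and rounding.

```python
import math

# Sketch: approximate the percentiles quoted above from each rank and pool
# size, assuming percentile ~ floor((1 - rank/total) * 100). Altmetric's
# exact tie handling and rounding are not documented here, so this is an
# illustration rather than their implementation.

def approx_percentile(rank: int, total: int) -> int:
    """Percent of the pool ranked below this output (rank 1 is the best)."""
    return math.floor((1 - rank / total) * 100)

contexts = {
    "All research outputs": (3_930_873, 23_577_761),  # quoted: 83rd percentile
    "Outputs from this journal": (553, 1_238),        # quoted: 55%
    "Outputs of similar age": (61_220, 302_259),      # quoted: 79%
    "Similar age, same journal": (11, 26),            # quoted: 53% (differs slightly here)
}

for label, (rank, total) in contexts.items():
    print(f"{label:<28} #{rank:>10,} of {total:>10,}  ->  ~{approx_percentile(rank, total)}%")
```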