
Reviewer training to assess knowledge translation in funding applications is long overdue

Overview of attention for an article published in Research Integrity and Peer Review, August 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (88th percentile)

Mentioned by

twitter
29 tweeters

Citations

2 Dimensions

Readers on

mendeley
8 Mendeley
DOI 10.1186/s41073-017-0037-8
Authors

Gayle Scarrow, Donna Angus, Bev J. Holmes

Abstract

Health research funding agencies are placing a growing focus on knowledge translation (KT) plans, also known as dissemination and implementation (D&I) plans, in grant applications to decrease the gap between what we know from research and what we do in practice, policy, and further research. Historically, review panels have focused on the scientific excellence of applications to determine which should be funded; however, relevance to societal health priorities, the facilitation of evidence-informed practice and policy, or realizing commercialization opportunities all require a different lens. While experts in their respective fields, grant reviewers may lack the competencies to rigorously assess the KT components of applications. Funders of health research (including health charities, non-profit agencies, governments, and foundations) have an obligation to ensure that these components of funding applications are as rigorously evaluated as the scientific components. In this paper, we discuss the need for a more rigorous evaluation of knowledge translation potential by review panels and propose how this may be addressed. We propose that reviewer training, supported in various ways including guidelines, KT expertise on review panels, and modalities such as online and face-to-face training, will result in the rigorous assessment of all components of funding applications, thus increasing the relevance and use of funded research evidence. An unintended but highly welcome consequence of such training could be higher quality D&I or KT plans in subsequent funding applications from trained reviewers.

Twitter Demographics

The data shown below were collected from the profiles of 29 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 8 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown       8   100%

Demographic breakdown

Readers by professional status   Count   As %
Student > PhD Student                2    25%
Student > Doctoral Student           1    13%
Lecturer                             1    13%
Professor                            1    13%
Other                                1    13%
Other                                1    13%
Unknown                              1    13%

Readers by discipline                                  Count   As %
Social Sciences                                            2    25%
Psychology                                                 2    25%
Pharmacology, Toxicology and Pharmaceutical Science        1    13%
Medicine and Dentistry                                     1    13%
Engineering                                                1    13%
Other                                                      0     0%
Unknown                                                    1    13%

Attention Score in Context

This research output has an Altmetric Attention Score of 17. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 30 August 2017.
All research outputs
#967,671
of 13,934,958 outputs
Outputs from Research Integrity and Peer Review
#46
of 73 outputs
Outputs of similar age
#31,216
of 268,498 outputs
Outputs of similar age from Research Integrity and Peer Review
#2
of 2 outputs
Altmetric has tracked 13,934,958 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 93rd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 73 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 35.1. This one is in the 36th percentile – i.e., 36% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 268,498 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 88% of its contemporaries.
We're also able to compare this research output to the 2 other outputs from the same source that were published within six weeks of this one.
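The percentile figures above follow the usual percentile-rank convention: the share of peer outputs that scored the same as or lower than this one. As an illustration only (Altmetric's actual pools contain hundreds of thousands of outputs and its scoring is proprietary), the calculation can be sketched with a hypothetical helper and toy data:

```python
def percentile_rank(score, peer_scores):
    """Percentage of peers scoring the same as or lower than `score`.

    Hypothetical helper for illustration; not Altmetric's implementation.
    """
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100 * at_or_below / len(peer_scores)

# Toy peer set of ten Attention Scores (invented for the example).
peers = [2, 5, 9, 17, 17, 21, 35, 50, 80, 120]
print(percentile_rank(17, peers))  # prints 50.0
```

With a score of 17 and five of the ten toy peers at or below that value, the output is the 50th percentile, i.e. "50% of its peers scored the same or lower", mirroring the phrasing used in the context section above.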