
Using democracy to award research funding: an observational study

Overview of attention for article published in Research Integrity and Peer Review, September 2017
About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (78th percentile)

Mentioned by

  • 17 X users
  • 1 Facebook page

Citations

  • 6 citations (Dimensions)

Readers

  • 9 readers (Mendeley)
DOI 10.1186/s41073-017-0040-0

Authors

Adrian G. Barnett, Philip Clarke, Cedryck Vaquette, Nicholas Graves

Abstract

Winning funding for health and medical research usually involves a lengthy application process. With success rates under 20%, much of the time spent by 80% of applicants could have been better used on actual research. An alternative funding system that could save time is using democracy to award the most deserving researchers based on votes from the research community. We aimed to pilot how such a system could work and examine some potential biases. We used an online survey with a convenience sample of Australian researchers. Researchers were asked to name the 10 scientists currently working in Australia that they thought most deserved funding for future research. For comparison, we used recent winners from large national fellowship schemes that used traditional peer review. Voting took a median of 5 min (inter-quartile range 3 to 10 min). Extrapolating to a national voting scheme, we estimate 599 working days of voting time (95% CI 490 to 728), compared with 827 working days for the current peer review system for fellowships. The gender ratio in the votes was a more equal 45:55 (female to male) compared with 34:66 in recent fellowship winners, although this could be explained by Simpson's paradox. Voters were biased towards their own institution, with an additional 1.6 votes per ballot (inter-quartile range 0.8 to 2.2) above the expected number. Respondents raised many concerns about the idea of using democracy to fund research, including vote rigging, lobbying and it becoming a popularity contest. This is a preliminary study of using voting that does not investigate many of the concerns about how a voting system would work. We were able to show that voting would take less time than traditional peer review and would spread the workload over many more reviewers. Further studies of alternative funding systems are needed as well as a wide discussion with the research community about potential changes.
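The time savings claimed in the abstract rest on simple arithmetic: total reviewing effort is the per-ballot median multiplied by the number of voters, converted into working days. A minimal sketch of that conversion (the voter pool size and the 8-hour working day are illustrative assumptions, not figures taken from the paper):

```python
# Back-of-envelope conversion of per-ballot voting time into working days.
# The voter pool size and working-day length below are assumptions for
# illustration only; the paper reports 599 working days (95% CI 490 to 728).
MINUTES_PER_WORKING_DAY = 8 * 60

def total_voting_days(n_voters: int, minutes_per_ballot: float) -> float:
    """Total reviewer time, in working days, if every voter casts one ballot."""
    return n_voters * minutes_per_ballot / MINUTES_PER_WORKING_DAY

# With a hypothetical national pool of 57,500 voters at the reported
# median of 5 minutes per ballot:
days = total_voting_days(57_500, 5)
print(round(days))  # 599
```

Under these assumed inputs the arithmetic lands on the same order of magnitude as the abstract's estimate, which is the point of the comparison with the 827 working days attributed to peer review.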

X Demographics

The data shown below were collected from the profiles of 17 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 9 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    9        100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph.D. Student           2        22%
Student > Master                  2        22%
Other                             1        11%
Unknown                           4        44%

Readers by discipline             Count    As %
Arts and Humanities               1        11%
Environmental Science             1        11%
Social Sciences                   1        11%
Neuroscience                      1        11%
Engineering                       1        11%
Other                             0        0%
Unknown                           4        44%
Attention Score in Context

This research output has an Altmetric Attention Score of 9. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 November 2019.
All research outputs: #4,230,667 of 25,554,853 outputs
Outputs from Research Integrity and Peer Review: #117 of 133 outputs
Outputs of similar age: #68,385 of 323,823 outputs
Outputs of similar age from Research Integrity and Peer Review: #5 of 5 outputs
Altmetric has tracked 25,554,853 research outputs across all sources so far. Compared to these, this one has done well and is in the 83rd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 133 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 76.8. This one is in the 12th percentile, i.e. 12% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 323,823 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 78% of its contemporaries.
We're also able to compare this research output to 5 others from the same source and published within six weeks on either side of this one.
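The percentile figures quoted above follow directly from each rank and cohort size: the percentile is roughly 100 × (1 − rank/total), the share of outputs scoring the same or lower. A quick sketch reproducing the numbers on this page:

```python
# Percentile rank from position in a cohort, truncated to a whole number.
# Ranks and cohort sizes are the ones listed in the rankings above.
def percentile(rank: int, total: int) -> int:
    """Share of the cohort (in %) that scored the same or lower."""
    return int(100 * (1 - rank / total))

print(percentile(4_230_667, 25_554_853))  # 83 -> 83rd percentile overall
print(percentile(117, 133))               # 12 -> 12th percentile in-journal
print(percentile(68_385, 323_823))        # 78 -> higher than 78% of same-age outputs
```

Each call matches the prose: 83rd percentile across all outputs, 12th percentile within the journal, and 78th percentile among outputs of a similar age.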