
Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR)

Overview of attention for article published in Systematic Reviews, May 2018

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (95th percentile)

Mentioned by

  • 1 blog
  • 1 policy source
  • 93 tweeters

Citations

  • 56 (Dimensions)

Readers on

  • 161 (Mendeley)
  • 1 (CiteULike)
Published in
Systematic Reviews, May 2018
DOI 10.1186/s13643-018-0740-7
Authors

Elaine Beller, Justin Clark, Guy Tsafnat, Clive Adams, Heinz Diehl, Hans Lund, Mourad Ouzzani, Kristina Thayer, James Thomas, Tari Turner, Jun Xia, Karen Robinson, Paul Glasziou

Abstract

Systematic reviews (SRs) are vital to health care but have become complicated and time-consuming owing to the rapid expansion of the evidence to be synthesised. Fortunately, many systematic review tasks can be automated, or at least assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced algorithms that can accurately mimic human effort in systematic review activity, faster and more cheaply. Automation tools need to be able to work together and to exchange data and results. We therefore initiated the International Collaboration for the Automation of Systematic Reviews (ICASR) to bring together all the parts of the automation of systematic review production. The first meeting was held in Vienna in October 2015, where we established a set of principles to enable tools to be developed and integrated into toolkits.

This paper sets out the principles devised at that meeting. They cover the need for greater efficiency in SR tasks; automation across the spectrum of SR tasks; continuous improvement; adherence to high quality standards; flexibility in using and combining components; the need for collaboration and varied skills; a preference for open-source, shared code and evaluation; and a requirement for replicability through rigorous and open evaluation.

Automation has great potential to improve the speed of systematic reviews, and considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort that will allow the integration of work by separate teams and build on the experience, code and evaluations of the many teams working across the globe.
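As an illustration only: the paper sets out principles rather than a specific algorithm, but one commonly automated step it alludes to is machine-assisted title/abstract screening. The sketch below (all names and the toy training records are hypothetical, and real tools use far richer models) shows the flavour of such a text-classification step with a minimal naive Bayes classifier in standard-library Python.

```python
# Toy title/abstract screener: a minimal multinomial naive Bayes classifier.
# Illustrative only -- the training data and labels here are invented.
import math
from collections import Counter

def tokenize(text):
    # Crude word tokenizer: lowercase, strip simple punctuation.
    return [w.strip(".,;:()").lower() for w in text.split() if w.strip(".,;:()")]

def train(records):
    # records: list of (text, label) pairs, label "include" or "exclude".
    counts = {"include": Counter(), "exclude": Counter()}
    priors = Counter()
    for text, label in records:
        priors[label] += 1
        counts[label].update(tokenize(text))
    vocab = set(counts["include"]) | set(counts["exclude"])
    return counts, priors, vocab

def classify(text, counts, priors, vocab):
    # Pick the label with the highest log-posterior, with Laplace smoothing.
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label in priors:
        lp = math.log(priors[label] / total)
        n = sum(counts[label].values())
        for w in tokenize(text):
            lp += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical screening decisions from a human reviewer.
training = [
    ("randomised controlled trial of statins in adults", "include"),
    ("double blind placebo controlled trial outcomes", "include"),
    ("case report of a rare dermatological condition", "exclude"),
    ("narrative commentary on health policy", "exclude"),
]
model = train(training)
print(classify("a randomised placebo controlled trial", *model))  # include
```

In practice, screening tools of this kind are used to rank or triage citations for human review rather than to make final inclusion decisions, which is consistent with the paper's emphasis on high quality standards and rigorous evaluation.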

Twitter Demographics

The data shown below were collected from the profiles of the 93 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 161 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown   161     100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph.D. Student           35     22%
Researcher                        28     17%
Student > Master                  19     12%
Student > Bachelor                14      9%
Professor                         10      6%
Other                             27     17%
Unknown                           28     17%

Readers by discipline            Count   As %
Medicine and Dentistry            34     21%
Computer Science                  31     19%
Nursing and Health Professions     9      6%
Agricultural and Biological
  Sciences                         8      5%
Engineering                        7      4%
Other                             33     20%
Unknown                           39     24%

Attention Score in Context

This research output has an Altmetric Attention Score of 64. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 17 November 2020.
All research outputs: #497,011 of 20,927,597 outputs
Outputs from Systematic Reviews: #61 of 1,819 outputs
Outputs of similar age: #13,061 of 298,414 outputs
Outputs of similar age from Systematic Reviews: #1 of 1 outputs
Altmetric has tracked 20,927,597 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 97th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,819 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.6. This one has done particularly well, scoring higher than 96% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 298,414 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 95% of its contemporaries.
We're also able to compare this research output to 1 other from the same source published within six weeks on either side of this one. This one has scored higher than all of them.