
Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR)

Overview of attention for article published in Systematic Reviews, May 2018

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (95th percentile)
  • High Attention Score compared to outputs of the same age and source (97th percentile)

Mentioned by

  • 1 blog
  • 1 policy source
  • 89 X users

Citations

  • 108 Dimensions

Readers on

  • 191 Mendeley
  • 1 CiteULike
Title
Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR)
Published in
Systematic Reviews, May 2018
DOI 10.1186/s13643-018-0740-7
Pubmed ID
Authors

Elaine Beller, Justin Clark, Guy Tsafnat, Clive Adams, Heinz Diehl, Hans Lund, Mourad Ouzzani, Kristina Thayer, James Thomas, Tari Turner, Jun Xia, Karen Robinson, Paul Glasziou, On behalf of the founding members of the ICASR group

Abstract

Systematic reviews (SR) are vital to health care but have become complicated and time-consuming due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR) to successfully put all the parts of automation of systematic review production together. The first meeting was held in Vienna in October 2015. We established a set of principles to enable tools to be developed and integrated into toolkits.

This paper sets out the principles devised at that meeting, which cover the need for improvement in the efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high-quality standards, flexibility of use and combining components, the need for a collaboration and varied skills, the desire for open-source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation.

Automation has great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort, which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.
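To make the kind of task automation described above concrete, here is a minimal sketch of one commonly automated step: prioritising citations for title/abstract screening with a text classifier. It is purely illustrative and is not the ICASR tooling or any method from the paper; the library (scikit-learn), the model choice and the toy data are all assumptions.

```python
# Minimal, illustrative sketch of automated citation screening for a
# systematic review. Assumptions: scikit-learn is available, and a handful
# of hand-labelled records exist from an initial manual screen.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: title/abstract text with a reviewer's
# include (1) / exclude (0) decision.
train_texts = [
    "Randomised trial of drug A versus placebo for condition X",
    "Qualitative interview study of patient experiences of condition X",
    "Double-blind RCT comparing drug A and drug B in adults",
    "Narrative review of historical treatments for condition X",
]
train_labels = [1, 0, 1, 0]

# TF-IDF features + logistic regression: a simple baseline often used for
# screening prioritisation in the automation literature.
screener = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
screener.fit(train_texts, train_labels)

# Rank unscreened records by predicted probability of inclusion so that
# human reviewers see the most likely includes first.
unscreened = [
    "Placebo-controlled trial of drug A in condition X",
    "Editorial on research funding priorities",
]
for text, p in zip(unscreened, screener.predict_proba(unscreened)[:, 1]):
    print(f"{p:.2f}  {text}")
```

In practice such a screener would be trained on thousands of labelled records and tuned for very high recall, since a missed include is far more costly in a systematic review than an extra record to read.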

X Demographics

The data shown below were collected from the profiles of 89 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 191 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown   191     100%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                       34      18%
Student > Ph.D. Student          33      17%
Student > Master                 26      14%
Student > Bachelor               14      7%
Professor                        13      7%
Other                            34      18%
Unknown                          37      19%

Readers by discipline                  Count   As %
Medicine and Dentistry                 42      22%
Computer Science                       26      14%
Business, Management and Accounting    11      6%
Social Sciences                        10      5%
Agricultural and Biological Sciences   10      5%
Other                                  42      22%
Unknown                                50      26%
Attention Score in Context

This research output has an Altmetric Attention Score of 62. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 February 2024.
All research outputs: #692,125 of 25,398,331 outputs
Outputs from Systematic Reviews: #85 of 2,231 outputs
Outputs of similar age: #15,284 of 343,815 outputs
Outputs of similar age from Systematic Reviews: #2 of 37 outputs
Altmetric has tracked 25,398,331 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 97th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,231 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.1. This one has done particularly well, scoring higher than 96% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 343,815 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 95% of its contemporaries.
We're also able to compare this research output to 37 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 97% of its contemporaries.
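As a rough check on the percentile figures quoted above, the sketch below recomputes them as the share of each cohort ranked below this output. This formula is an assumption: Altmetric's exact rounding and tie-handling rules are not public, so small discrepancies (as with the same-source, same-age cohort) are expected.

```python
# Approximate percentile rank from the cohort positions quoted above.
# Assumption: percentile = floor(100 * (1 - rank / cohort_size)); Altmetric's
# actual rounding and tie handling may differ slightly.
cohorts = {
    "All research outputs": (692_125, 25_398_331),
    "Outputs from Systematic Reviews": (85, 2_231),
    "Outputs of similar age": (15_284, 343_815),
    "Similar age from Systematic Reviews": (2, 37),
}

for name, (rank, total) in cohorts.items():
    percentile = int(100 * (1 - rank / total))  # floor toward zero
    print(f"{name}: #{rank:,} of {total:,} -> ~{percentile}th percentile")
```

Run as written, this reproduces the 97th, 96th and 95th percentile figures for the first three cohorts; the same-source, same-age cohort comes out lower than the page's stated 97%, presumably because of how Altmetric handles small cohorts or tied scores.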