
Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR)

Overview of attention for article published in Systematic Reviews, January 2018

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (93rd percentile)
  • High Attention Score compared to outputs of the same age and source (85th percentile)

Mentioned by

  • 1 blog
  • 1 policy source
  • 41 X users

Citations

  • 41 Dimensions

Readers on

  • 100 Mendeley
Title
Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR)
Published in
Systematic Reviews, January 2018
DOI 10.1186/s13643-017-0667-4
Authors

Annette M. O’Connor, Guy Tsafnat, Stephen B. Gilbert, Kristina A. Thayer, Mary S. Wolfe

Abstract

The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

X Demographics

Demographic data were collected from the profiles of the 41 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 100 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    100      100%

Demographic breakdown

Readers by professional status    Count    As %
Researcher                        20       20%
Student > Ph.D. Student           14       14%
Student > Master                  9        9%
Other                             7        7%
Professor                         5        5%
Other                             19       19%
Unknown                           26       26%

Readers by discipline             Count    As %
Medicine and Dentistry            22       22%
Computer Science                  13       13%
Nursing and Health Professions    7        7%
Engineering                       5        5%
Psychology                        4        4%
Other                             18       18%
Unknown                           31       31%
Attention Score in Context

This research output has an Altmetric Attention Score of 31. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 October 2018.
Comparison group                                   Rank          Total outputs
All research outputs                               #1,231,621    24,920,664
Outputs from Systematic Reviews                    #175          2,172
Outputs of similar age                             #28,876       455,015
Outputs of similar age from Systematic Reviews     #10           60
Altmetric has tracked 24,920,664 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 95th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,172 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.1. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 455,015 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 93% of its contemporaries.
We're also able to compare this research output to 60 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 85% of its contemporaries.
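The percentile figures quoted above follow directly from the rank-versus-group-size pairs in the comparison table. The short Python sketch below is only an illustration, not Altmetric's actual code: it approximates each percentile as 1 − rank/total. Altmetric's exact calculation may handle ties, rounding, and snapshot timing differently, which is why the similar-age-and-source comparison comes out near 83% here rather than the reported 85%.

```python
# Illustrative approximation of the percentile figures from the table above.
# Assumption: percentile ~= 1 - rank/total; Altmetric's exact method may differ.

comparisons = {
    "All research outputs": (1_231_621, 24_920_664),
    "Outputs from Systematic Reviews": (175, 2_172),
    "Outputs of similar age": (28_876, 455_015),
    "Outputs of similar age from Systematic Reviews": (10, 60),
}

for group, (rank, total) in comparisons.items():
    # Share of outputs in the group that this output scored higher than
    percentile = (1 - rank / total) * 100
    print(f"{group}: rank #{rank:,} of {total:,} -> ~{percentile:.0f}th percentile")
```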