
Evaluating Data Abstraction Assistant, a novel software application for data abstraction during systematic reviews: protocol for a randomized controlled trial

Overview of attention for article published in Systematic Reviews, November 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (77th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 12 X users

Citations

  • 12 Dimensions citations

Readers on

  • 54 Mendeley readers
DOI 10.1186/s13643-016-0373-7
Authors

Ian J. Saldanha, Christopher H. Schmid, Joseph Lau, Kay Dickersin, Jesse A. Berlin, Jens Jap, Bryant T. Smith, Simona Carini, Wiley Chan, Berry De Bruijn, Byron C. Wallace, Susan M. Hutfless, Ida Sim, M. Hassan Murad, Sandra A. Walsh, Elizabeth J. Whamond, Tianjing Li

Abstract

Data abstraction, a critical systematic review step, is time-consuming and prone to errors. Current standards for approaches to data abstraction rest on a weak evidence base. We developed the Data Abstraction Assistant (DAA), a novel software application designed to facilitate the abstraction process by allowing users to (1) view study article PDFs juxtaposed to electronic data abstraction forms linked to a data abstraction system, (2) highlight (or "pin") the location of the text in the PDF, and (3) copy relevant text from the PDF into the form. We describe the design of a randomized controlled trial (RCT) that compares the relative effectiveness of (A) DAA-facilitated single abstraction plus verification by a second person, (B) traditional (non-DAA-facilitated) single abstraction plus verification by a second person, and (C) traditional independent dual abstraction plus adjudication to ascertain the accuracy and efficiency of abstraction. This is an online, randomized, three-arm, crossover trial. We will enroll 24 pairs of abstractors (i.e., sample size is 48 participants), each pair comprising one less and one more experienced abstractor. Pairs will be randomized to abstract data from six articles, two under each of the three approaches. Abstractors will complete pre-tested data abstraction forms using the Systematic Review Data Repository (SRDR), an online data abstraction system. The primary outcomes are (1) proportion of data items abstracted that constitute an error (compared with an answer key) and (2) total time taken to complete abstraction (by two abstractors in the pair, including verification and/or adjudication). The DAA trial uses a practical design to test a novel software application as a tool to help improve the accuracy and efficiency of the data abstraction process during systematic reviews. Findings from the DAA trial will provide much-needed evidence to strengthen current recommendations for data abstraction approaches. 
The trial is registered at the National Information Center on Health Services Research and Health Care Technology (NICHSR) under Registration # HSRP20152269: https://wwwcf.nlm.nih.gov/hsr_project/view_hsrproj_record.cfm?NLMUNIQUE_ID=20152269&SEARCH_FOR=Tianjing%20Li. All items from the World Health Organization Trial Registration Data Set are covered at various locations in this protocol. Protocol version and date: this is version 2.0 of the protocol, dated September 6, 2016. We will communicate any protocol amendments to the Institutional Review Boards (IRBs) of the Johns Hopkins Bloomberg School of Public Health (JHBSPH) and Brown University, and will also make any needed modifications to the NICHSR website in a timely fashion.
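As an aside on the design described in the abstract: each of the 24 pairs abstracts six articles, two under each of the three approaches (A, B, C). A minimal sketch of that allocation structure follows; the shuffled per-pair sequence and the fixed seed are illustrative assumptions, not the protocol's actual randomization procedure.

```python
import random

# Approach labels paraphrased from the abstract; the allocation logic
# below is a sketch, not the trial's actual randomization scheme.
APPROACHES = ["A: DAA-facilitated single abstraction + verification",
              "B: traditional single abstraction + verification",
              "C: traditional dual abstraction + adjudication"]

def allocate(n_pairs=24, n_articles=6, seed=0):
    """Assign each abstractor pair a sequence of approaches over its articles."""
    rng = random.Random(seed)  # fixed seed only for reproducibility of the sketch
    allocations = []
    for pair in range(1, n_pairs + 1):
        # Six articles per pair, two under each of the three approaches.
        sequence = APPROACHES * (n_articles // len(APPROACHES))
        rng.shuffle(sequence)
        allocations.append({"pair": pair, "sequence": sequence})
    return allocations

alloc = allocate()
# Every pair sees each approach exactly twice across its six articles.
assert all(a["sequence"].count(ap) == 2 for a in alloc for ap in APPROACHES)
```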

X Demographics

The data shown below were collected from the profiles of 12 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 54 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
United States       1     2%
Unknown            53    98%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                     9    17%
Student > Ph.D. Student              5     9%
Researcher                           4     7%
Professor                            4     7%
Student > Doctoral Student           4     7%
Other                               14    26%
Unknown                             14    26%

Readers by discipline                          Count   As %
Medicine and Dentistry                            11    20%
Computer Science                                   5     9%
Nursing and Health Professions                     4     7%
Biochemistry, Genetics and Molecular Biology       3     6%
Social Sciences                                    3     6%
Other                                              9    17%
Unknown                                           19    35%
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 February 2018.
  • All research outputs: #5,237,110 of 25,554,853 outputs
  • Outputs from Systematic Reviews: #1,011 of 2,239 outputs
  • Outputs of similar age: #93,402 of 416,517 outputs
  • Outputs of similar age from Systematic Reviews: #23 of 40 outputs
Altmetric has tracked 25,554,853 research outputs across all sources so far. Compared to these, this one has done well and is in the 79th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far, Altmetric has tracked 2,239 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.2. This one has received more attention than average, scoring higher than 54% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 416,517 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 77% of its contemporaries.
We're also able to compare this research output to 40 others from the same source and published within six weeks on either side of this one. This one is in the 45th percentile – i.e., 45% of its contemporaries scored the same or lower than it.
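The percentile statements above ("45% of its contemporaries scored the same or lower") follow the standard percentile-rank definition: the share of peer scores at or below the given score. A minimal sketch of that calculation, using made-up scores rather than Altmetric's actual data:

```python
def percentile_rank(score, peer_scores):
    """Percent of peers scoring the same as or lower than `score`."""
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# Hypothetical peer scores, not real Altmetric data.
peers = [1, 2, 3, 5, 7, 8, 13, 21, 2, 4]
print(round(percentile_rank(7, peers)))  # → 70 (7 of 10 peers scored <= 7)
```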