
Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions

Overview of attention for article published in BMC Medical Research Methodology, December 2017

About this Attention Score

  • Good Attention Score compared to outputs of the same age (69th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

7 X users

Citations

10 Dimensions

Readers on

32 Mendeley
Title
Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions
Published in
BMC Medical Research Methodology, December 2017
DOI 10.1186/s12874-017-0460-z
Pubmed ID
Authors

Francisco Gómez-García, Juan Ruano, Macarena Aguilar-Luque, Patricia Alcalde-Mellado, Jesús Gay-Mimbrera, José Luis Hernández-Romero, Juan Luis Sanz-Cabanillas, Beatriz Maestre-López, Marcelino González-Padilla, Pedro J. Carmona-Fernández, Antonio Vélez García-Nieto, Beatriz Isla-Tejera

Abstract

The information and structure of article summaries may influence researchers' and clinicians' decisions to conduct deeper full-text analyses. In particular, abstracts of systematic reviews (SRs) and meta-analyses (MAs) should provide structured summaries that allow quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone.

Systematic literature searches for SRs and/or MAs about psoriasis were undertaken in MEDLINE, EMBASE, and the Cochrane database. For each review, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company.

The study analysed 139 SRs on psoriasis interventions, which reported, on average, 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews, and SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95% CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95% CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95% CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a total PRISMA-A score < 6 that were not identified as an SR or MA in the title and did not explain the bias risk assessment methods were classified as having low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that reported main outcome results and explained the bias risk assessment method were classified as having low bias risk.

The methodological quality and bias risk of SRs may thus be estimated by analysing the quality and completeness of their abstracts. This approach aims to facilitate the evaluation of synthesised evidence by clinical professionals who lack methodological expertise. External validation is necessary.
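The two classification rules reported in the abstract reduce to simple threshold checks on PRISMA-A items. The sketch below is a minimal, hypothetical illustration of those rules in Python; the field names and the flat boolean logic are assumptions for readability and do not reproduce the authors' actual decision-tree models.

```python
# Illustrative sketch of the abstract-based screening rules described in the
# abstract above. The thresholds mirror those reported by the authors; the
# feature names and boolean structure are hypothetical simplifications.

from dataclasses import dataclass


@dataclass
class AbstractFeatures:
    prisma_a_total: int           # total PRISMA-A score (0-12 items reported)
    identified_as_sr_or_ma: bool  # title identifies the report as an SR or MA
    reports_bias_method: bool     # abstract explains the bias risk assessment method
    reports_main_outcomes: bool   # abstract reports results for the main outcomes


def screen_methodological_quality(f: AbstractFeatures) -> str:
    """Flag abstracts likely to belong to low-methodological-quality reviews."""
    if (f.prisma_a_total < 6
            and not f.identified_as_sr_or_ma
            and not f.reports_bias_method):
        return "low methodological quality"
    return "not classified as low quality"


def screen_bias_risk(f: AbstractFeatures) -> str:
    """Flag abstracts likely to belong to low-bias-risk reviews."""
    if (f.prisma_a_total >= 9
            and f.reports_main_outcomes
            and f.reports_bias_method):
        return "low bias risk"
    return "not classified as low bias risk"


if __name__ == "__main__":
    example = AbstractFeatures(prisma_a_total=10,
                               identified_as_sr_or_ma=True,
                               reports_bias_method=True,
                               reports_main_outcomes=True)
    print(screen_methodological_quality(example))  # not classified as low quality
    print(screen_bias_risk(example))               # low bias risk
```

In practice such a filter would only be a first-pass screen on abstracts; borderline cases would still require full-text appraisal with AMSTAR and ROBIS.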

X Demographics

The data shown below were collected from the profiles of 7 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 32 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 32 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 8 | 25%
Student > Bachelor | 4 | 13%
Student > Ph. D. Student | 4 | 13%
Professor > Associate Professor | 3 | 9%
Researcher | 3 | 9%
Other | 2 | 6%
Unknown | 8 | 25%

Readers by discipline | Count | As %
Medicine and Dentistry | 8 | 25%
Pharmacology, Toxicology and Pharmaceutical Science | 2 | 6%
Engineering | 2 | 6%
Computer Science | 2 | 6%
Nursing and Health Professions | 2 | 6%
Other | 6 | 19%
Unknown | 10 | 31%
Attention Score in Context

This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 08 August 2018.
All research outputs
#6,489,701
of 23,015,156 outputs
Outputs from BMC Medical Research Methodology
#981
of 2,029 outputs
Outputs of similar age
#132,059
of 441,864 outputs
Outputs of similar age from BMC Medical Research Methodology
#27
of 52 outputs
Altmetric has tracked 23,015,156 research outputs across all sources so far. This one has received more attention than most of these and is in the 70th percentile.
So far Altmetric has tracked 2,029 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.2. This one has received more attention than average, scoring higher than 50% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 441,864 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 69% of its contemporaries.
We're also able to compare this research output to 52 others from the same source and published within six weeks on either side of this one. This one is in the 46th percentile – i.e., 46% of its contemporaries scored the same or lower than it.
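The age-matched comparisons above amount to a percentile rank of this output's Attention Score among outputs published within six weeks of it. The snippet below is a minimal sketch of that calculation, assuming the contemporaries' scores are available; the function and the example scores are hypothetical and not Altmetric's implementation.

```python
# Minimal sketch of an age-matched percentile rank, as described above.
# The list of contemporary Attention Scores is hypothetical example data.

def percentile_rank(score: float, contemporaries: list[float]) -> float:
    """Percentage of contemporaries scoring the same as or lower than `score`."""
    if not contemporaries:
        return 0.0
    at_or_below = sum(1 for s in contemporaries if s <= score)
    return 100.0 * at_or_below / len(contemporaries)


# Example: an Attention Score of 4 compared against a small made-up sample.
print(percentile_rank(4, [0, 1, 1, 2, 3, 5, 8, 12]))  # 62.5
```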