

Overview of attention for article published in Health Economics Review, September 2016

Mentioned by

1 X user
1 Facebook page

Citations

7 Dimensions

Readers on

32 Mendeley
Title
Comparison of post-authorisation measures from regulatory authorities with additional evidence requirements from the HTA body in Germany – are additional data requirements by the Federal Joint Committee justified?
Published in
Health Economics Review, September 2016
DOI 10.1186/s13561-016-0124-4
Authors

Jörg Ruof, Thomas Staab, Charalabos-Markos Dintsios, Jakob Schröter, Friedrich Wilhelm Schwartz

Abstract

The aim of this study was to compare post-authorisation measures (PAMs) from the European Medicines Agency (EMA) with data requests in fixed-termed conditional appraisals of early benefit assessments from the German Federal Joint Committee (G-BA). Medicinal products with completed benefit assessments during an assessment period of 3.5 years were considered. PAMs extracted from European Public Assessment Reports (EPARs) were compared with data requests issued by the G-BA in the context of conditional appraisals. Twenty conditional appraisals (19 products) and 34 EPARs containing PAMs (33 products) were identified. Data categories (efficacy, safety, etc.), data types (type of study required to address the request) and clarity of requests were determined. Conditional appraisals disproportionately focused on oncology products (13/19 products with conditional appraisals vs. 14/33 products with PAMs). No clear rationale for the G-BA issuing conditional appraisals could be identified in public sources. Both EMA and G-BA requested mainly efficacy and safety data (44/54 and 23/35 categories requested, respectively); however, 28/35 G-BA data requirements went beyond requests made by the EMA. Almost half of the G-BA requests (9/20), but no PAMs, were unclear, and no methodological guidance for fulfilling the data requirements was provided by the G-BA. Better alignment between data requests from regulatory authorities and health technology assessment bodies is strongly recommended.

X Demographics


The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for 32 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    32       100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Master                  7        22%
Researcher                        4        13%
Student > Doctoral Student        3        9%
Other                             2        6%
Student > Bachelor                2        6%
Other                             3        9%
Unknown                           11       34%
Readers by discipline                                  Count    As %
Pharmacology, Toxicology and Pharmaceutical Science    5        16%
Medicine and Dentistry                                 5        16%
Computer Science                                       2        6%
Social Sciences                                        2        6%
Agricultural and Biological Sciences                   1        3%
Other                                                  4        13%
Unknown                                                13       41%
Attention Score in Context


This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 29 September 2016.
All research outputs: #18,473,108 of 22,890,496 outputs
Outputs from Health Economics Review: #333 of 430 outputs
Outputs of similar age: #244,856 of 322,600 outputs
Outputs of similar age from Health Economics Review: #13 of 15 outputs
Altmetric has tracked 22,890,496 research outputs across all sources so far. This one is in the 11th percentile – i.e., 11% of other outputs scored the same or lower than it.
So far Altmetric has tracked 430 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.0. This one is in the 5th percentile – i.e., 5% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 322,600 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 13th percentile – i.e., 13% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 15 others from the same source and published within six weeks on either side of this one. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
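The percentile comparisons above all follow the same rank statistic: the share of peer outputs whose Attention Score is the same as or lower than this one's. A minimal sketch of that calculation (the function name and rounding choice are assumptions for illustration, not Altmetric's actual implementation):

```python
def attention_percentile(score, peer_scores):
    """Return the percentile of `score` among `peer_scores`, defined as the
    percentage of peers scoring the same or lower (rounded to a whole number).
    This mirrors the wording above; it is not Altmetric's published code."""
    same_or_lower = sum(1 for s in peer_scores if s <= score)
    return round(100 * same_or_lower / len(peer_scores))

# Example with made-up peer scores: a score of 1 among ten peers,
# three of whom scored the same or lower, lands in the 30th percentile.
print(attention_percentile(1, [0, 0, 1, 2, 3, 4, 5, 6, 7, 8]))
```

Under this definition, older cohorts inflate percentiles (as noted above), which is why the age-matched comparison restricts `peer_scores` to outputs published within six weeks on either side.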