
Publishing descriptions of non-public clinical datasets: proposed guidance for researchers, repositories, editors and funding organisations

Overview of attention for article published in Research Integrity and Peer Review, June 2016

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#42 of 128)
  • High Attention Score compared to outputs of the same age (95th percentile)
  • High Attention Score compared to outputs of the same age and source (83rd percentile)

Mentioned by

  • 3 blogs
  • 42 X users
  • 1 Facebook page

Citations

  • 17 Dimensions

Readers on

  • 62 Mendeley
  • 1 CiteULike
Title
Publishing descriptions of non-public clinical datasets: proposed guidance for researchers, repositories, editors and funding organisations
Published in
Research Integrity and Peer Review, June 2016
DOI
10.1186/s41073-016-0015-6
Pubmed ID
Authors
Iain Hrynaszkiewicz, Varsha Khodiyar, Andrew L. Hufton, Susanna-Assunta Sansone

Abstract

Sharing of experimental clinical research data usually happens between individuals or research groups rather than via public repositories, in part due to the need to protect research participant privacy. This approach to data sharing makes it difficult to connect journal articles with their underlying datasets and is often insufficient for ensuring access to data in the long term. Voluntary data sharing services such as the Yale Open Data Access (YODA) and Clinical Study Data Request (CSDR) projects have increased accessibility to clinical datasets for secondary uses while protecting patient privacy and the legitimacy of secondary analyses, but these resources are generally disconnected from journal articles, where researchers typically search for reliable information to inform future research. New scholarly journal and article types dedicated to increasing accessibility of research data have emerged in recent years and, in general, journals are developing stronger links with data repositories. There is a need for increased collaboration between journals, data repositories, researchers, funders, and voluntary data sharing services to increase the visibility and reliability of clinical research. Using the journal Scientific Data as a case study, we propose and show examples of changes to the format and peer-review process for journal articles to more robustly link them to data that are only available on request. We also propose additional features for data repositories to better accommodate non-public clinical datasets, including Data Use Agreements (DUAs).

X Demographics

The data shown below were collected from the profiles of the 42 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 62 Mendeley readers of this research output.

Geographical breakdown

Country           Count   As %
United Kingdom        3     5%
United States         3     5%
Mexico                1     2%
Japan                 1     2%
Spain                 1     2%
Unknown              53    85%

Demographic breakdown

Readers by professional status     Count   As %
Other                                 10    16%
Student > Ph.D. Student               10    16%
Researcher                            10    16%
Student > Master                       8    13%
Professor > Associate Professor        5     8%
Other                                 12    19%
Unknown                                7    11%

Readers by discipline                          Count   As %
Medicine and Dentistry                            11    18%
Computer Science                                   8    13%
Agricultural and Biological Sciences               6    10%
Social Sciences                                    6    10%
Biochemistry, Genetics and Molecular Biology       3     5%
Other                                             18    29%
Unknown                                           10    16%
Attention Score in Context

This research output has an Altmetric Attention Score of 47. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 19 March 2019.
All research outputs: #847,446 of 24,671,780 outputs
Outputs from Research Integrity and Peer Review: #42 of 128 outputs
Outputs of similar age: #16,690 of 359,947 outputs
Outputs of similar age from Research Integrity and Peer Review: #2 of 6 outputs
Altmetric has tracked 24,671,780 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 96th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 128 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 73.4. This one has received more attention than average, scoring higher than 67% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 359,947 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 95% of its contemporaries.
We're also able to compare this research output to 6 others from the same source and published within six weeks on either side of this one. This one has scored higher than 4 of them.
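The percentile figures in this section follow from the rank-within-total counts listed above. The sketch below is an illustration only, assuming a simple rank-to-percentile conversion (it is not Altmetric's published method); it shows that the reported percentiles are consistent with computing the share of tracked outputs that scored lower than this one.

```python
import math

def percentile_rank(rank: int, total: int) -> float:
    """Percentage of tracked outputs that scored lower than the output
    ranked `rank`, where rank 1 is the highest-scoring output."""
    return 100.0 * (total - rank) / total

# Figures quoted on this page, assumed to map onto the stated percentiles:
print(math.floor(percentile_rank(847_446, 24_671_780)))  # 96 -> "96th percentile" (top 5% of all outputs)
print(math.floor(percentile_rank(16_690, 359_947)))      # 95 -> 95th percentile among similar-age outputs
print(math.floor(percentile_rank(2, 6)))                 # 66 -> scored higher than 4 of its 6 same-source contemporaries
```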