
OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia

Overview of attention for article published in Journal of Biomedical Semantics, January 2012

Mentioned by

  • Twitter: 2 tweeters

Citations

  • Dimensions: 18

Readers on

  • Mendeley: 70
  • CiteULike: 1
Published in: Journal of Biomedical Semantics, January 2012
DOI: 10.1186/2041-1480-3-s1-s7
Authors

Olga Tcheremenskaia, Romualdo Benigni, Ivelina Nikolova, Nina Jeliazkova, Sylvia E Escher, Monika Batke, Thomas Baier, Vladimir Poroikov, Alexey Lagunin, Micha Rautenberg, Barry Hardy

Abstract

The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims to provide unified access to toxicity data, predictive models, and validation procedures. Interoperability of resources is achieved through a common information model based on the OpenTox ontologies, which describe predictive algorithms, models, and toxicity data. Because toxicological data may come from heterogeneous sources, a deployed ontology that unifies the terminology and the resources is critical for the rational and reliable organization of the data and for its automatic processing.

Twitter Demographics

The data shown below were collected from the profiles of the 2 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 70 Mendeley readers of this research output.

Geographical breakdown

Country          Count  As %
Bulgaria             2    3%
India                2    3%
United Kingdom       1    1%
Spain                1    1%
Japan                1    1%
Unknown             63   90%

Demographic breakdown

Readers by professional status   Count  As %
Researcher                          21   30%
Student > Ph.D. Student             10   14%
Other                                7   10%
Professor > Associate Professor      7   10%
Student > Master                     6    9%
Other                               14   20%
Unknown                              5    7%

Readers by discipline                                 Count  As %
Computer Science                                         18   26%
Agricultural and Biological Sciences                     12   17%
Chemistry                                                 8   11%
Pharmacology, Toxicology and Pharmaceutical Science       6    9%
Engineering                                               4    6%
Other                                                    14   20%
Unknown                                                   8   11%

Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 20 September 2021.
All research outputs: #12,010,426 of 18,927,317 outputs
Outputs from Journal of Biomedical Semantics: #202 of 340 outputs
Outputs of similar age: #80,647 of 136,574 outputs
Outputs of similar age from Journal of Biomedical Semantics: #1 of 1 outputs
Altmetric has tracked 18,927,317 research outputs across all sources so far. This one is in the 34th percentile – i.e., 34% of other outputs scored the same or lower than it.
So far Altmetric has tracked 340 research outputs from this source. They receive a mean Attention Score of 4.7. This one is in the 36th percentile – i.e., 36% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 136,574 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 38th percentile – i.e., 38% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 1 other from the same source published within six weeks on either side of this one. This one has scored higher than all of them.