
Entity linking for biomedical literature

Overview of attention for an article published in BMC Medical Informatics and Decision Making, May 2015

About this Attention Score

  • Good Attention Score compared to outputs of the same age (67th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (60th percentile)

Mentioned by

5 X users

Citations

32 Dimensions

Readers on

77 Mendeley
Title
Entity linking for biomedical literature
Published in
BMC Medical Informatics and Decision Making, May 2015
DOI 10.1186/1472-6947-15-s1-s4
Pubmed ID
Authors

Jin G Zheng, Daniel Howsmon, Boliang Zhang, Juergen Hahn, Deborah McGuinness, James Hendler, Heng Ji

Abstract

The Entity Linking (EL) task links entity mentions in an unstructured document to entities in a knowledge base. Although this problem is well studied in the news and social media domains, it has received little attention in the life sciences. One outcome of tackling the EL problem in the life sciences is to enable scientists to build computational models of biological processes more efficiently; however, simply applying an entity linker trained on news produces inadequate results. Because existing supervised approaches require a large amount of manually labeled training data, which is currently unavailable for the life science domain, we propose a novel unsupervised collective inference approach that links entities from unstructured full texts of biomedical literature to 300 ontologies. The approach leverages the rich semantic information and structures in ontologies for similarity computation and entity ranking. Without using any manual annotation, our approach significantly outperforms a state-of-the-art supervised EL method (a 9% absolute gain in linking accuracy) that requires 15,000 manually annotated entity mentions for training. These promising results establish a benchmark for the EL task in the life science domain. We also provide an in-depth analysis and discussion of both the challenges and the opportunities in automatic knowledge enrichment for scientific literature. In summary, we propose a novel unsupervised collective inference approach to address the EL problem in a new domain and show that it outperforms a current state-of-the-art supervised approach trained with a large amount of manually labeled data. The life sciences remain an underrepresented domain for applying EL techniques; by providing a small benchmark data set and identifying opportunities, we hope to stimulate discussion across natural language processing and bioinformatics and to motivate others to develop techniques for this largely untapped domain.
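
To make the idea in the abstract concrete, the following is a minimal, illustrative sketch of unsupervised, ontology-based entity linking: candidates are generated by token overlap with concept labels and synonyms, then ranked collectively with a coherence bonus when a candidate is structurally linked in the ontology to the candidates of the other mentions in the same document. The Concept class, the toy ontology, the coherence definition, and the 0.7/0.3 weighting are hypothetical choices for illustration, not the authors' actual system.

    # Illustrative sketch only: unsupervised EL via string similarity plus
    # collective coherence over ontology links. Names and weights are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Concept:
        id: str
        label: str
        synonyms: list = field(default_factory=list)
        neighbors: set = field(default_factory=set)  # ids of related concepts (e.g. is-a links)

    def candidates(mention, ontology):
        """Candidate concepts whose label or synonyms overlap the mention (token Jaccard)."""
        m = set(mention.lower().split())
        cands = []
        for c in ontology:
            sims = [len(m & set(n.lower().split())) / len(m | set(n.lower().split()))
                    for n in [c.label] + c.synonyms]
            if max(sims) > 0:
                cands.append((c, max(sims)))
        return cands

    def coherence(c1, c2):
        """Structural coherence: 1 if the two concepts are linked in the ontology, else 0."""
        return 1.0 if c2.id in c1.neighbors or c1.id in c2.neighbors else 0.0

    def link_mentions(mentions, ontology, alpha=0.7):
        """Collective ranking: surface similarity plus coherence with candidates
        of the other mentions in the same document."""
        per_mention = [candidates(m, ontology) for m in mentions]
        linked = []
        for i, cands in enumerate(per_mention):
            others = [c for j, cs in enumerate(per_mention) if j != i for c, _ in cs]
            scored = [(alpha * sim + (1 - alpha) * max((coherence(c, o) for o in others), default=0.0), c)
                      for c, sim in cands]
            linked.append(max(scored, key=lambda t: t[0])[1] if scored else None)
        return linked

    # Toy usage: two co-occurring mentions linked against a three-concept ontology.
    onto = [
        Concept("C1", "insulin receptor", ["INSR"], {"C2"}),
        Concept("C2", "insulin", [], {"C1"}),
        Concept("C3", "receptor kinase", [], set()),
    ]
    print([c.id for c in link_mentions(["insulin", "insulin receptor"], onto)])  # ['C2', 'C1']

In the full approach described in the paper, the coherence signal would come from much richer ontology semantics (definitions, relations, hierarchies) than the single neighbor check sketched here.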

X Demographics

The data shown below were collected from the profiles of 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 77 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
United States       2     3%
Spain               1     1%
Unknown            74    96%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student            20    26%
Student > Master                    17    22%
Student > Bachelor                   8    10%
Researcher                           7     9%
Professor                            4     5%
Other                                8    10%
Unknown                             13    17%
Readers by discipline                  Count   As %
Computer Science                          36    47%
Medicine and Dentistry                     7     9%
Engineering                                5     6%
Agricultural and Biological Sciences       5     6%
Psychology                                 3     4%
Other                                      6     8%
Unknown                                   15    19%
Attention Score in Context

This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 May 2015.
All research outputs: #6,954,391 of 22,805,349 outputs
Outputs from BMC Medical Informatics and Decision Making: #679 of 1,988 outputs
Outputs of similar age: #82,617 of 266,611 outputs
Outputs of similar age from BMC Medical Informatics and Decision Making: #17 of 43 outputs
Altmetric has tracked 22,805,349 research outputs across all sources so far. This one has received more attention than most of these and is in the 68th percentile.
So far Altmetric has tracked 1,988 research outputs from this source. They receive a mean Attention Score of 4.9. This one has gotten more attention than average, scoring higher than 64% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 266,611 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 67% of its contemporaries.
We're also able to compare this research output to 43 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 60% of its contemporaries.
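
For readers unfamiliar with how the percentile comparisons above are derived, the following is a minimal sketch: the Attention Score of this output (4) is ranked within a reference set of scores, such as the outputs published within six weeks on either side of it. The reference scores below are made up purely for illustration.

    # Hedged sketch: ranking one Attention Score within a reference set.
    def percentile(score, reference_scores):
        """Percentage of reference outputs whose score is strictly below `score`."""
        below = sum(1 for s in reference_scores if s < score)
        return 100.0 * below / len(reference_scores)

    contemporaries = [0, 1, 1, 2, 3, 3, 5, 8, 12, 40]  # made-up Attention Scores
    print(round(percentile(4, contemporaries)))         # 60 for this toy set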