REKA: Relation Extraction with Knowledge-Aware Attention

Abstract

Relation extraction (RE) is an important task with wide applications. Distant supervision is widely used in RE because it automatically constructs labeled data and reduces manual annotation effort, but it usually produces many instances with incorrect labels. In addition, most existing relation extraction methods rely only on the textual content of sentences to extract relations. In fact, many knowledge graphs are readily available and can provide useful information about entities and relations, which has the potential to alleviate the noisy-data problem and improve relation extraction performance. In this paper, we propose a knowledge-aware attention model that incorporates knowledge graph information into relation extraction. In our approach, we first learn representations of entities and relations from the knowledge graph using graph embedding methods. We then propose a knowledge-aware word attention model to select informative words in sentences for relation extraction. We also propose a knowledge-aware sentence attention model to select useful sentences, alleviating the noisy-data problem introduced by distant supervision. We conduct experiments on a widely used dataset, and the results show that our approach effectively improves the performance of neural relation extraction.
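The abstract describes the idea but not the exact formulation. As a rough illustration only, the sketch below shows one plausible form of knowledge-aware sentence attention, assuming TransE-style knowledge graph embeddings (so the relation query is approximated as tail minus head) and a bilinear projection into the sentence-encoding space. All names, shapes, and the projection matrix are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def knowledge_aware_sentence_attention(sentence_encodings, head_emb, tail_emb, W):
    """Weight each sentence in a bag by its agreement with a KG-derived
    relation query, then pool the bag (hypothetical formulation).

    sentence_encodings: (num_sentences, d_s) outputs of a sentence encoder
    head_emb, tail_emb: (d_k,) knowledge graph entity embeddings
    W: (d_k, d_s) projection from KG space to sentence space (assumption)
    """
    query = tail_emb - head_emb                   # TransE-style relation query (assumption)
    scores = sentence_encodings @ (W.T @ query)   # one relevance score per sentence
    alpha = softmax(scores)                       # attention weights over the bag
    bag_repr = alpha @ sentence_encodings         # weighted bag representation, shape (d_s,)
    return bag_repr, alpha

# Toy usage: a bag of 3 (possibly noisy) sentences, 8-dim sentence encodings, 5-dim KG space.
rng = np.random.default_rng(0)
S = rng.normal(size=(3, 8))
h, t = rng.normal(size=5), rng.normal(size=5)
W = rng.normal(size=(5, 8))
bag, weights = knowledge_aware_sentence_attention(S, h, t, W)
print(weights)  # low-weight sentences contribute less to the bag-level relation prediction
```

The intent of such a scheme is that sentences whose encodings disagree with the KG-derived relation query receive small weights, which is how a sentence-level attention can down-weight mislabeled instances produced by distant supervision.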

Citation (APA)

Wang, P., Liu, H., Wu, F., Song, J., Xu, H., & Wang, W. (2019). REKA: Relation Extraction with Knowledge-Aware Attention. In Communications in Computer and Information Science (Vol. 1134 CCIS, pp. 62–73). Springer. https://doi.org/10.1007/978-981-15-1956-7_6
