Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction

153 Citations (citations of this article)
123 Readers (Mendeley users who have this article in their library)

Abstract

Entities, as the essential elements in relation extraction tasks, exhibit certain structure. In this work, we formulate such structure as distinctive dependencies between mention pairs. We then propose SSAN, which incorporates these structural dependencies within the standard self-attention mechanism and throughout the overall encoding stage. Specifically, we design two alternative transformation modules inside each self-attention building block to produce attentive biases so as to adaptively regularize its attention flow. Our experiments demonstrate the usefulness of the proposed entity structure and the effectiveness of SSAN. It significantly outperforms competitive baselines, achieving new state-of-the-art results on three popular document-level relation extraction datasets. We further provide ablation and visualization to show how the entity structure guides the model for better relation extraction. Our code is publicly available.
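The abstract describes injecting structure-dependent biases into self-attention so that attention flow is regularized by mention dependencies. As a rough illustration only, the PyTorch sketch below adds a learnable bias per dependency type and head to the attention scores before the softmax. The class name, the number of dependency categories, and the use of a single scalar bias per dependency/head are assumptions made for this sketch; the paper itself uses richer transformation modules to produce the biases.

```python
# Minimal sketch of self-attention with structure-dependent attentive biases,
# in the spirit of the SSAN idea summarized in the abstract. Names and the
# scalar-bias design are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_DEPENDENCIES = 6  # assumed number of mention-dependency categories (incl. "none")

class StructuredSelfAttention(nn.Module):
    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.q_proj = nn.Linear(hidden_size, hidden_size)
        self.k_proj = nn.Linear(hidden_size, hidden_size)
        self.v_proj = nn.Linear(hidden_size, hidden_size)
        # One learnable scalar bias per (dependency type, head): a simplified
        # stand-in for the paper's transformation modules.
        self.dep_bias = nn.Parameter(torch.zeros(NUM_DEPENDENCIES, num_heads))

    def forward(self, x: torch.Tensor, dep_ids: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden); dep_ids: (batch, seq_len, seq_len) with an
        # integer dependency label for every token pair.
        b, n, _ = x.shape
        q = self.q_proj(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.v_proj(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5  # (b, h, n, n)
        bias = self.dep_bias[dep_ids].permute(0, 3, 1, 2)        # (b, h, n, n)
        attn = F.softmax(scores + bias, dim=-1)                  # bias regularizes attention flow
        out = attn @ v                                           # (b, h, n, head_dim)
        return out.transpose(1, 2).reshape(b, n, -1)
```

In this sketch, dep_ids would be built from the document's entity mention annotations (e.g. whether two tokens belong to coreferent mentions or merely co-occur in the same sentence), which corresponds in spirit to the dependency categories the abstract refers to.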

Citation (APA)

Xu, B., Wang, Q., Lyu, Y., Zhu, Y., & Mao, Z. (2021). Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 16, pp. 14149–14157). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i16.17665

Readers over time

[Chart: Mendeley readers per year, 2020–2025]

Readers' Seniority

PhD / Post grad / Masters / Doc: 41 (91%)
Researcher: 2 (4%)
Professor / Associate Prof.: 1 (2%)
Lecturer / Post doc: 1 (2%)

Readers' Discipline

Computer Science: 49 (89%)
Engineering: 4 (7%)
Materials Science: 1 (2%)
Chemistry: 1 (2%)
