Syntax-aware entity representations for neural relation extraction

Authors:

Abstract

Distantly supervised relation extraction has been widely used to find novel relational facts between entities in text, and it scales easily to very large corpora. Previous studies on neural relation extraction treat this task as a multi-instance learning problem and encode sentences in low-dimensional spaces via neural networks. Although great progress has been made, these studies seldom consider the information carried by the entities themselves, which is of great significance to relation extraction. In this article, we propose several methods based on different tree-based models to learn syntax-aware entity representations for neural relation extraction. First, we encode the context of entities on dependency trees as sentence-level entity embeddings using tree-structured neural network models. Then, we apply an inter-sentence attention mechanism to obtain bag-level entity embeddings over all sentences containing the specified entity pair. Finally, we combine the sentence embeddings and entity embeddings for relation classification. Experimental results on a widely used real-world dataset indicate that our system outperforms state-of-the-art relation extraction systems.
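The inter-sentence attention step described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes each sentence in a bag already yields a fixed-size entity embedding (e.g. from a tree-structured encoder) and aggregates them into one bag-level embedding by attending with a hypothetical relation query vector.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def bag_level_entity_embedding(sentence_entity_embs, query):
    """Aggregate sentence-level entity embeddings into a single
    bag-level embedding via attention over all sentences in the bag.

    sentence_entity_embs: (n_sentences, d) array, one row per sentence
    query: (d,) hypothetical relation query vector (an assumption here)
    """
    scores = sentence_entity_embs @ query      # one relevance score per sentence
    weights = softmax(scores)                  # attention distribution over the bag
    return weights @ sentence_entity_embs      # weighted sum, shape (d,)

# Toy bag: 3 sentences mentioning the same entity pair, 4-dim embeddings.
bag = np.array([[0.1, 0.2, 0.0, 0.5],
                [0.4, 0.1, 0.3, 0.2],
                [0.0, 0.5, 0.1, 0.1]])
query = np.array([1.0, 0.0, 0.0, 0.0])
bag_emb = bag_level_entity_embedding(bag, query)
```

The bag-level entity embedding `bag_emb` would then be concatenated with a sentence (bag) representation before the final relation classifier, per the pipeline the abstract outlines.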

Keywords: Distant supervision, Relation extraction, Neural network, Syntactic structure, Attention mechanism

Article history: Received 30 June 2018, Revised 14 June 2019, Accepted 16 July 2019, Available online 19 July 2019, Version of Record 8 August 2019.

DOI: https://doi.org/10.1016/j.artint.2019.07.004