Soft-self and Hard-cross Graph Attention Network for Knowledge Graph Entity Alignment
Abstract:
Knowledge Graph (KG) entity alignment aims to identify entities across different KGs that refer to the same real-world object; it is a key step towards KG integration and KG completion. Recently, Graph Attention Network (GAT) based models have become a popular paradigm in the entity alignment community owing to their ability to model structural data. However, current GAT-based models either ignore relation semantics and edge directions when learning entity neighbor representations, or make no distinction between incoming and outgoing neighbors when calculating attention scores. Furthermore, the softmax functions used in the soft attention mechanisms of current models always assign small but nonzero probabilities to trivial elements, which is unsuitable for learning alignment-oriented entity embeddings. Taking these issues into account, this paper proposes a novel GAT-based entity alignment model, SHEA (Soft-self and Hard-cross Graph Attention Networks for Knowledge Graph Entity Alignment), which takes both relation semantics and edge directions into consideration when modeling a single KG, and distinguishes prior-aligned neighbors from general ones to take full advantage of prior alignment information. Specifically, a four-channel graph attention layer is conceived to aggregate information from entity neighbors in different cases. The first two channels teach entities to aggregate information from their neighbors with soft-self attention, where both the neighboring entities and the linking relations are used to obtain attention values. The other two channels teach entities to aggregate information from their neighbors with hard-cross attention, where TF-IDF is used to measure the importance of entity neighbors. Extensive experiments on five publicly available datasets demonstrate the superior performance of our model.
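The abstract describes the two attention styles only at a high level, so the following is a minimal PyTorch sketch of the contrast it draws: a soft-self channel that scores a neighbor from the neighbor entity and the linking relation via softmax, versus a hard-cross channel that keeps only the most important neighbors under a TF-IDF weight and zeroes out the rest. All names, shapes, and the top-m selection rule (SoftSelfChannel, tfidf, top_m) are illustrative assumptions, not the paper's actual formulation; the paper additionally splits each style into incoming and outgoing channels, which this sketch shows for one direction only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftSelfChannel(nn.Module):
    """Soft attention over (neighbor entity, relation) pairs within one KG."""
    def __init__(self, d):
        super().__init__()
        # attention vector scores the concatenation [h_i ; h_j ; r_ij]
        self.att = nn.Linear(3 * d, 1, bias=False)

    def forward(self, h, nbr_idx, rel_emb):
        # h: (n, d) entity embeddings
        # nbr_idx: (n, k) indices of k sampled neighbors per entity
        # rel_emb: (n, k, d) embeddings of the relations linking them
        h_i = h.unsqueeze(1).expand(-1, nbr_idx.size(1), -1)   # (n, k, d)
        h_j = h[nbr_idx]                                       # (n, k, d)
        e = torch.tanh(self.att(torch.cat([h_i, h_j, rel_emb], dim=-1)))
        a = F.softmax(e, dim=1)        # soft attention: every neighbor gets
        return (a * h_j).sum(dim=1)    # a nonzero weight, however trivial

def hard_cross_channel(h, nbr_idx, tfidf, top_m=5):
    # tfidf: (n, k) precomputed TF-IDF importance of each neighbor.
    # Hard attention keeps only the top-m neighbors and zeroes out the
    # rest, avoiding the small-but-nonzero weights softmax would assign.
    w, pos = tfidf.topk(top_m, dim=1)                          # (n, m)
    kept = torch.gather(nbr_idx, 1, pos)                       # (n, m)
    w = w / w.sum(dim=1, keepdim=True).clamp_min(1e-9)         # renormalize
    return (w.unsqueeze(-1) * h[kept]).sum(dim=1)              # (n, d)
```

In this reading, the softmax channel is what the abstract criticizes for leaking probability mass onto trivial neighbors, while the hard channel discards them outright; combining both lets an entity benefit from smooth intra-KG aggregation and sparse, high-confidence cross-KG signals from prior-aligned neighbors.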
Keywords: Entity alignment, Four-channel graph attention layer, Soft-self attention, Hard-cross attention
Article history: Received 14 September 2020, Revised 18 August 2021, Accepted 18 August 2021, Available online 25 August 2021, Version of Record 4 September 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107415