An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies

Authors:

Highlights:

• Composing the meaning of a sentence from its words is crucial in most NLP tasks.

• The type of grammatical relation between words contributes to the semantic composition.

• Proposes a relation gate to model the relation between two consecutive LSTM units.

• The sentence is modeled using its dependency parse tree structure as well as its edge types.

• Achieves improvements in semantic relatedness scoring and sentiment analysis tasks.
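The highlights describe a relation gate that conditions the composition of two connected LSTM units on the type of the dependency edge between them. Since the paper's equations are not reproduced on this page, the sketch below is only a hypothetical illustration of that idea: a child-sum Tree-LSTM node in NumPy where each child's hidden state is first modulated by a gate computed from an embedding of its dependency relation (e.g. `nsubj`, `dobj`). All class names, weight shapes, and the exact gate formula are assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RelationGatedTreeLSTM:
    """Hypothetical child-sum Tree-LSTM with a per-edge relation gate."""

    def __init__(self, dim, relations, seed=0):
        rng = np.random.default_rng(seed)
        self.dim = dim
        # one learned embedding per dependency relation type (assumption)
        self.rel_emb = {r: rng.normal(0.0, 0.1, dim) for r in relations}
        mat = lambda: rng.normal(0.0, 0.1, (dim, dim))
        self.Wi, self.Ui = mat(), mat()   # input gate
        self.Wf, self.Uf = mat(), mat()   # forget gate (per child)
        self.Wo, self.Uo = mat(), mat()   # output gate
        self.Wu, self.Uu = mat(), mat()   # candidate cell
        self.Wr, self.Ur = mat(), mat()   # relation gate (assumption)

    def node(self, x, children=()):
        """x: word vector; children: list of (h, c, relation) triples."""
        gated = []
        for h_k, c_k, rel in children:
            # relation gate: scale the child's hidden state by edge type
            r_k = sigmoid(self.Wr @ self.rel_emb[rel] + self.Ur @ h_k)
            gated.append((r_k * h_k, c_k))
        h_sum = sum((h for h, _ in gated), np.zeros(self.dim))
        # standard child-sum Tree-LSTM composition over the gated children
        i = sigmoid(self.Wi @ x + self.Ui @ h_sum)
        o = sigmoid(self.Wo @ x + self.Uo @ h_sum)
        u = np.tanh(self.Wu @ x + self.Uu @ h_sum)
        c = i * u
        for h_k, c_k in gated:
            f_k = sigmoid(self.Wf @ x + self.Uf @ h_k)
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# toy usage: compose a two-child dependency subtree
dim = 8
model = RelationGatedTreeLSTM(dim, relations=["nsubj", "dobj"])
rng = np.random.default_rng(1)
h1, c1 = model.node(rng.normal(size=dim))            # leaf child
h2, c2 = model.node(rng.normal(size=dim))            # leaf child
h, c = model.node(rng.normal(size=dim),
                  [(h1, c1, "nsubj"), (h2, c2, "dobj")])
```

The design choice illustrated here is that the edge type enters the computation only through the gate on the child's hidden state, so children attached by different relations contribute differently to the parent even when their hidden states are similar.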


Keywords: Sentence representation learning, Universal dependencies, Semantic relatedness scoring, Sentiment analysis

Article history: Received 28 November 2019, Revised 11 June 2020, Accepted 18 July 2020, Available online 25 August 2020, Version of Record 20 October 2020.

DOI: https://doi.org/10.1016/j.ipm.2020.102362