Enhancing Transformer-based language models with commonsense representations for knowledge-driven machine comprehension

Authors:

Highlights:

• Three injection methods are proposed to explicitly integrate commonsense knowledge into Transformer-based language models (TrLMs).

• A token-level multi-hop mask mechanism is introduced to filter out irrelevant knowledge (see the sketch after these highlights).

• The incremental TrLMs achieve a 1%-4.1% improvement at lower computational cost.
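
The highlights name the techniques but not their concrete form. As a rough illustration only, the sketch below shows one plausible shape for gated knowledge injection combined with a token-level relevance mask, assuming PyTorch; the class name `KnowledgeFusion`, the gating scheme, and the 0.5 relevance threshold are illustrative assumptions, not the paper's actual method or API.

```python
# Hypothetical sketch: fuse commonsense embeddings into token representations
# through a learned gate, after zeroing out knowledge that a token-level
# relevance score judges irrelevant. Names and thresholds are assumptions.
import torch
import torch.nn as nn

class KnowledgeFusion(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)
        self.score = nn.Linear(hidden_size, 1)  # per-token relevance score

    def forward(self, token_states, knowledge_states):
        # token_states, knowledge_states: (batch, seq_len, hidden)
        # Token-level mask: keep knowledge only where it scores as relevant.
        relevance = torch.sigmoid(self.score(knowledge_states))  # (B, L, 1)
        mask = (relevance > 0.5).float()                         # hard filter
        filtered = knowledge_states * mask
        # Gated fusion of contextual and commonsense representations.
        g = torch.sigmoid(self.gate(torch.cat([token_states, filtered], dim=-1)))
        return g * token_states + (1.0 - g) * filtered

# Usage with dummy tensors (hidden size 768, as in BERT-base):
fusion = KnowledgeFusion(hidden_size=768)
tokens = torch.randn(2, 128, 768)
knowledge = torch.randn(2, 128, 768)
fused = fusion(tokens, knowledge)  # (2, 128, 768)
```

A gate of this kind lets each token decide how much commonsense signal to absorb, which matches the stated goal of injecting knowledge without overwhelming the pretrained representations.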

Keywords: Machine Reading Comprehension, Transformer, Commonsense knowledge, Pretrained language model

Article history: Received 9 November 2020, Revised 5 February 2021, Accepted 4 March 2021, Available online 6 March 2021, Version of Record 20 March 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.106936