Improving sequence labeling with labeled clue sentences

Authors:

Highlights:

• A general framework uses labeled clue sentences to mitigate labeled-data shortages.

• Two methods for retrieving labeled clue sentences are designed.

• A mask-label strategy is devised to avoid over-fitting.

• The Transformer's self-attention is modified to jointly exploit the original and clue sentences.

• We verify the effectiveness of the proposed framework on three sequence labeling tasks.
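The mask-label strategy in the highlights can be illustrated with a minimal sketch. The function below is hypothetical (the paper's exact masking scheme is not given here); it assumes the strategy amounts to randomly hiding a fraction of the retrieved clue sentence's labels during training, so the model cannot simply copy the clue's labels and over-fit to them.

```python
import random

def mask_clue_labels(labels, mask_prob=0.3, mask_token="[MASK]", seed=0):
    """Randomly replace a fraction of clue-sentence labels with a mask token.

    Hypothetical sketch of a mask-label strategy: hiding some clue labels
    during training discourages the model from copying them verbatim.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    return [mask_token if rng.random() < mask_prob else y for y in labels]

# BIO labels for a retrieved clue sentence (e.g., an NER task)
clue_labels = ["B-PER", "I-PER", "O", "O", "B-LOC", "O"]
masked = mask_clue_labels(clue_labels, mask_prob=0.5)
print(masked)
```

In a real training loop, the masking rate would be a tuned hyperparameter, and the masked positions would be fed to the model alongside the original sentence rather than printed.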


Keywords: Labeled clue sentences, Sequence labeling, Pre-trained language models

Article history: Received 21 March 2022, Revised 27 August 2022, Accepted 28 August 2022, Available online 7 September 2022, Version of Record 30 September 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109828