Virtual prompt pre-training for prototype-based few-shot relation extraction
Authors:
Highlights:
• Jointly pre-training an entity-relation-aware Language Model and a prompt encoder.
• A novel prompt-based prototype network, using pre-trained virtual prompts.
• Classification freely employs features with vocabulary-sized dimensions.
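The highlights describe a prototype-based classifier for few-shot relation extraction. As background, a minimal sketch of generic prototypical-network classification is shown below: class prototypes are the mean embeddings of each class's support examples, and a query is assigned to the nearest prototype. This is a standard illustration, not the paper's actual prompt-based model; the function names and the use of Euclidean distance are assumptions for the sketch.

```python
import numpy as np

def build_prototypes(support_embs: np.ndarray, labels: np.ndarray) -> dict:
    """Class prototype = mean of the support embeddings belonging to that class."""
    return {c: support_embs[labels == c].mean(axis=0)
            for c in set(labels.tolist())}

def classify(query_emb: np.ndarray, prototypes: dict):
    """Assign the query to the class whose prototype is nearest (Euclidean)."""
    return min(prototypes, key=lambda c: np.linalg.norm(query_emb - prototypes[c]))

# Toy 2-way, 2-shot episode with 2-dimensional embeddings.
support = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
labels = np.array([0, 0, 1, 1])
protos = build_prototypes(support, labels)
```

In the paper's setting the embeddings would instead be vocabulary-sized prompt-based features from the pre-trained language model, but the nearest-prototype decision rule is the same.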
Keywords: Few-shot learning, Information extraction, Prompt tuning, Pre-trained Language Model
Article history: Received 18 July 2022; Revised 20 September 2022; Accepted 25 September 2022; Available online 30 September 2022; Version of Record 11 October 2022.
DOI: https://doi.org/10.1016/j.eswa.2022.118927