BERT-SMAP: Paying attention to Essential Terms in passage ranking beyond BERT
Authors:
Highlights:
• We propose a hybrid ranking architecture for passage ranking that effectively addresses the problem of ranking models being misled by passages that overlap with the query but are irrelevant.
• We propose a pooling attention mechanism called SMAP.
• SMAP is combined with a pre-trained language model to identify distracting passages (an illustrative sketch follows the highlights).
• An approximately 5% absolute improvement is achieved on the WikiQA dataset over the prior best approach based on the same pre-trained language model.
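The highlights name SMAP but do not describe how the pooling attention is computed. As a rough illustration only, the sketch below shows a generic attention-pooling head over BERT token representations used to score a query–passage pair; the class name, layer choices, and all other details are assumptions for illustration and do not reflect the authors' actual SMAP implementation.

```python
# Illustrative sketch only: a generic attention-pooling head on top of BERT
# token representations for passage scoring. All names and design choices here
# are assumptions, not the authors' SMAP mechanism.
import torch
import torch.nn as nn
from transformers import BertModel

class AttentionPoolingRanker(nn.Module):
    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Scores each token's importance for the pooled representation.
        self.attn = nn.Linear(hidden, 1)
        # Maps the pooled representation to a relevance score.
        self.score = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        # Token-level contextual representations from BERT
        # (input is a concatenated query-passage pair).
        hidden_states = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Attention weights over tokens, masking out padding positions.
        logits = self.attn(hidden_states).squeeze(-1)
        logits = logits.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(logits, dim=-1)
        # Attention-weighted sum of token vectors -> pooled representation.
        pooled = torch.einsum("bt,bth->bh", weights, hidden_states)
        # One relevance score per (query, passage) pair in the batch.
        return self.score(pooled).squeeze(-1)
```

In a pairwise or pointwise ranking setup, such a head would be trained end-to-end with the encoder, and passages would be ranked by the resulting scores; the paper's hybrid architecture presumably combines this kind of term-level pooled signal with the standard [CLS]-based score, but the exact combination is not specified in this listing.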
Keywords: Passage ranking, Attention mechanism, Information retrieval, Question answering, Pre-trained model
Article history: Received 24 April 2021, Revised 1 October 2021, Accepted 7 October 2021, Available online 17 November 2021, Version of Record 17 November 2021.
DOI: https://doi.org/10.1016/j.ipm.2021.102788