Sentiment Lexical Strength Enhanced Self-supervised Attention Learning for Sentiment Analysis

Authors:

Highlights:

Abstract:

In Natural Language Processing (NLP), attention mechanisms are often used to quantify the importance of context words for sentiment prediction. However, they tend to focus on high-frequency words while ignoring low-frequency words that play an active role in certain positions. In this paper, we propose a Sentiment Lexical Strength Enhanced Self-supervised Attention Learning (SLS-ESAL) approach. Specifically, we iteratively mine attention supervision information from all input sentences, and then use weights quantified by sentiment lexical strength to enhance attention learning in the final training stage. This enables our model to keep focusing on active context words in different positions and to eliminate the effects of misleading ones. Experiments on three datasets show that our approach improves sentiment analysis performance and verify that attention weights can serve as an explanation for text classification.
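The abstract describes guiding attention toward sentiment-bearing words using lexicon-derived strengths as supervision. The paper's exact formulation is not given here; the following is a minimal illustrative sketch under assumed details: a toy lexicon, a target attention distribution proportional to lexical strength, and a KL-divergence supervision loss. The lexicon entries, smoothing floor, and loss form are all assumptions for illustration, not the authors' method.

```python
import numpy as np

# Hypothetical sentiment lexicon: word -> sentiment strength in [0, 1].
# These entries are illustrative, not taken from the paper.
LEXICON = {"great": 0.9, "movie": 0.1, "boring": 0.8, "plot": 0.1}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lexicon_target(words):
    """Target attention distribution proportional to lexical strength.
    A small floor keeps out-of-lexicon words at nonzero probability."""
    s = np.array([LEXICON.get(w, 0.0) + 1e-2 for w in words])
    return s / s.sum()

def supervision_loss(att_scores, words):
    """KL(target || attention): large when the model's attention
    ignores sentiment-bearing context words."""
    att = softmax(att_scores)
    tgt = lexicon_target(words)
    return float(np.sum(tgt * np.log(tgt / att)))

words = ["great", "movie"]
biased = np.array([0.0, 3.0])   # attention stuck on the frequent word
guided = np.array([3.0, 0.0])   # attention on the sentiment word
assert supervision_loss(guided, words) < supervision_loss(biased, words)
```

Adding such a term to the classification loss would penalize attention that drifts to frequent but sentiment-neutral words, which is the behavior the abstract says SLS-ESAL corrects.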

Keywords: Sentiment lexical strength, Attention supervision information, Sentiment analysis

Publication history: Received 7 February 2022, Revised 23 June 2022, Accepted 24 June 2022, Available online 28 June 2022, Version of Record 8 July 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109335