ASK-RoBERTa: A pretraining model for aspect-based sentiment classification via sentiment knowledge mining

Abstract:

The main objective of aspect-based sentiment classification (ABSC) is to predict the sentiment polarities of different aspects in sentences or documents. Recent research integrates sentiment terms into pretraining models, and the accuracy of those terms directly affects ABSC performance. This paper introduces a sentiment knowledge-adaptive pretraining model (ASK-RoBERTa). A sentiment word dictionary is first built from general and domain-specific sentiment words. We then develop a series of aspect-term and sentiment-word mining rules based on part-of-speech tagging and sentence dependency grammar; these rules account for word dependencies, compounding, and conjunctions. The pretraining model optimizes the mining rules to capture the dependency between aspects and sentiment words. Experimental results on multiple public benchmark datasets demonstrate the satisfactory performance of ASK-RoBERTa.
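The abstract does not spell out the mining rules themselves, so the following is only a minimal illustrative sketch of the three rule families it names (word dependencies, compounding, and conjunctions), applied to a hand-supplied dependency parse; the rule details, token format, and example sentence are assumptions, not the paper's actual implementation.

```python
# Illustrative sketch only: the paper's concrete mining rules are not given in
# the abstract, so the three rule families below (adjectival modification,
# noun compounding, conjunction) are plausible stand-ins for the rule types it
# names. Tokens are pre-parsed tuples: (index, text, POS, head_index, dep).

def mine_pairs(tokens, sentiment_lexicon):
    """Extract (aspect, sentiment-word) pairs from one parsed sentence."""
    by_idx = {t[0]: t for t in tokens}

    def full_aspect(idx):
        # Compounding rule: fold "compound" modifiers into the head noun,
        # e.g. "spring" + "rolls" -> "spring rolls".
        parts = [t for t in tokens if t[3] == idx and t[4] == "compound"]
        parts.append(by_idx[idx])
        return " ".join(t[1] for t in sorted(parts, key=lambda t: t[0]))

    def nsubj_of(idx):
        # Nominal subject of a predicative adjective: "rolls were tasty".
        for t in tokens:
            if t[3] == idx and t[4] == "nsubj" and t[2] == "NOUN":
                return t[0]
        return None

    pairs = []
    for idx, text, pos, head, dep in tokens:
        if pos != "ADJ" or text not in sentiment_lexicon:
            continue
        target = None
        if dep == "amod" and by_idx[head][2] == "NOUN":
            target = head                       # attributive: "tasty rolls"
        elif dep == "ROOT":
            target = nsubj_of(idx)              # predicative: "rolls were tasty"
        elif dep == "conj":                     # conjunction: "tasty and cheap"
            h = by_idx[head]
            if h[4] == "amod" and by_idx[h[3]][2] == "NOUN":
                target = h[3]
            elif h[4] == "ROOT":
                target = nsubj_of(h[0])
        if target is not None:
            pairs.append((full_aspect(target), text))
    return pairs


# "The spring rolls were tasty and cheap" (dependency parse supplied by hand)
parsed = [
    (0, "The", "DET", 2, "det"),
    (1, "spring", "NOUN", 2, "compound"),
    (2, "rolls", "NOUN", 4, "nsubj"),
    (3, "were", "AUX", 4, "cop"),
    (4, "tasty", "ADJ", -1, "ROOT"),
    (5, "and", "CCONJ", 6, "cc"),
    (6, "cheap", "ADJ", 4, "conj"),
]
print(mine_pairs(parsed, {"tasty", "cheap"}))
# -> [('spring rolls', 'tasty'), ('spring rolls', 'cheap')]
```

Both sentiment words attach to the same compound aspect "spring rolls": "tasty" via the nominal-subject rule and "cheap" via conjunction sharing, matching the kinds of interactions the abstract says the rules must handle.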

Keywords: Aspect-based sentiment classification, RoBERTa, Sentiment knowledge, Dependency grammar, Knowledge mining

Article history: Received 9 April 2022, Revised 19 July 2022, Accepted 20 July 2022, Available online 25 July 2022, Version of Record 5 August 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109511