Aspect-level sentiment classification based on attention-BiLSTM model and transfer learning

Authors:

Highlights:

Abstract

Aspect-level sentiment classification, a fine-grained sentiment analysis task that yields complete and detailed results, has been a research focus in recent years. However, the performance of neural network models on this task is largely limited by the small scale of aspect-level sentiment classification datasets, owing to the difficulty of labeling such data. In this paper, we propose an aspect-level sentiment classification model based on the Attention-Bidirectional Long Short-Term Memory (Attention-BiLSTM) model and transfer learning. Building on the Attention-BiLSTM model, three transfer strategies, Pre-training (PRET), Multitask learning (MTL), and Pre-training & Multitask learning (PRET+MTL), are proposed to transfer knowledge learned from document-level sentiment classification to aspect-level sentiment classification. Finally, the performance of the four models is verified on four datasets. Experiments show that the proposed methods compensate for the undertraining of neural network models caused by the small datasets available for aspect-level sentiment classification.
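The abstract does not give the model equations, but the core of an Attention-BiLSTM classifier is an attention layer that scores each BiLSTM hidden state and forms a weighted sentence representation. Below is a minimal pure-Python sketch of that pooling step; the dot-product scoring function, the random vectors, and all dimensions are illustrative assumptions, not the paper's exact formulation:

```python
import math
import random

def softmax(scores):
    # Numerically stable softmax over a list of raw attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, w):
    """Attention-weighted pooling over BiLSTM hidden states.

    hidden_states: list of T vectors of length 2H (forward and backward
                   LSTM halves concatenated at each time step)
    w:             attention parameter vector of length 2H (learned in a
                   real model; random here for illustration)
    """
    # Score each time step, then normalize scores into weights that sum to 1.
    scores = [sum(h_i * w_i for h_i, w_i in zip(h, w)) for h in hidden_states]
    alpha = softmax(scores)
    # Sentence representation = attention-weighted sum of hidden states.
    dim = len(hidden_states[0])
    rep = [sum(a * h[d] for a, h in zip(alpha, hidden_states)) for d in range(dim)]
    return rep, alpha

random.seed(0)
T, H2 = 5, 8  # 5 time steps, 2H = 8 hidden units (illustrative sizes)
states = [[random.gauss(0, 1) for _ in range(H2)] for _ in range(T)]
w = [random.gauss(0, 1) for _ in range(H2)]
rep, alpha = attention_pool(states, w)
```

In the transfer-learning setting the abstract describes, the BiLSTM and attention parameters feeding this pooling step are what PRET initializes from document-level training and what MTL shares across the two tasks.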

Keywords: Sentiment classification, BiLSTM, Attention, Transfer learning

Article history: Received 24 July 2021, Revised 10 March 2022, Accepted 10 March 2022, Available online 23 March 2022, Version of Record 1 April 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.108586