Transformer models for text-based emotion detection: a review of BERT-based approaches
Authors: Francisca Adoma Acheampong, Henry Nunoo-Mensah, Wenyu Chen
Abstract
The importance of contextual information in most natural language processing (NLP) applications cannot be overemphasized. Extracting context yields significant improvements in many NLP tasks, including emotion recognition from text. This paper reviews transformer-based models for NLP, highlighting the pros and cons of each. The models discussed include the Generative Pre-training (GPT) model and its variants, Transformer-XL, Cross-lingual Language Models (XLM), and Bidirectional Encoder Representations from Transformers (BERT). Given BERT’s strength and popularity in text-based emotion detection, the paper then surveys recent works in which researchers proposed various BERT-based models, presenting each work’s contributions, results, limitations, and the datasets used. We also provide future research directions to encourage further research on text-based emotion detection using these models.
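To make the surveyed approach concrete, the following is a minimal sketch of BERT-based emotion classification using the Hugging Face transformers library. It is illustrative only: the checkpoint name, the four-emotion label set, and the example sentence are assumptions, not taken from the paper, and the classification head here is untrained (the surveyed works fine-tune it on emotion-labelled corpora).

```python
# Minimal sketch: a BERT encoder with a sequence-classification head for
# emotion detection. Checkpoint and labels are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"               # generic BERT checkpoint (assumed)
EMOTIONS = ["anger", "fear", "joy", "sadness"]  # example label set (assumed)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(EMOTIONS)
)  # attaches a randomly initialized head; real use requires fine-tuning

# Encode a sample sentence and pick the highest-scoring emotion label.
inputs = tokenizer("I can't believe we finally won!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(EMOTIONS[logits.argmax(dim=-1).item()])
```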
Keywords: Natural language processing, Sentiment analysis, Text-based emotion detection, Transformers
Paper URL: https://doi.org/10.1007/s10462-021-09958-2