Intermediate-Task Transfer Learning with BERT for Sarcasm Detection


Bibliographic Details
Main Authors: Edoardo Savini, Cornelia Caragea
Format: Article
Language: English
Published: MDPI AG, 2022-03-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/5/844
Description
Summary: Sarcasm detection plays an important role in natural language processing as it can impact the performance of many applications, including sentiment analysis, opinion mining, and stance detection. Despite substantial progress on sarcasm detection, the research results are scattered across datasets and studies. In this paper, we survey the current state-of-the-art and present strong baselines for sarcasm detection based on BERT pre-trained language models. We further improve our BERT models by fine-tuning them on related intermediate tasks before fine-tuning them on our target task. Specifically, relying on the correlation between sarcasm and (implied negative) sentiment and emotions, we explore a transfer learning framework that uses sentiment classification and emotion detection as individual intermediate tasks to infuse knowledge into the target task of sarcasm detection. Experimental results on three datasets that have different characteristics show that the BERT-based models outperform many previous models.
ISSN:2227-7390
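
The intermediate-task transfer scheme the abstract describes can be sketched in code. The following is an illustrative toy sketch, not the authors' implementation: a small randomly initialized encoder stands in for BERT (so the example runs without pretrained weights), the encoder is first fine-tuned jointly with a head on an intermediate task (here, a synthetic "sentiment" labeling), and then the same encoder is fine-tuned again with a fresh head on the target "sarcasm" task. All data, dimensions, and the `fine_tune` helper are hypothetical choices for illustration.

```python
# Toy sketch of intermediate-task transfer learning (assumption: this
# mirrors only the high-level recipe in the abstract, not the paper's
# actual BERT setup). A tiny MLP stands in for the BERT encoder.
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM = 16


class Encoder(nn.Module):
    """Stand-in for a pre-trained encoder such as BERT."""

    def __init__(self, dim: int = DIM):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh())

    def forward(self, x):
        return self.net(x)


def fine_tune(encoder, head, x, y, steps=50, lr=0.1):
    """Jointly update encoder and task head on one task's data."""
    opt = torch.optim.SGD(
        list(encoder.parameters()) + list(head.parameters()), lr=lr
    )
    loss_fn = nn.CrossEntropyLoss()
    loss = None
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(head(encoder(x)), y)
        loss.backward()
        opt.step()
    return loss.item()


encoder = Encoder()

# Stage 1: intermediate task (synthetic binary "sentiment" labels).
x_sent = torch.randn(64, DIM)
y_sent = (x_sent.sum(dim=1) > 0).long()
fine_tune(encoder, nn.Linear(DIM, 2), x_sent, y_sent)

# Stage 2: target task ("sarcasm") with a fresh classification head,
# starting from the intermediately fine-tuned encoder weights.
x_sarc = torch.randn(64, DIM)
y_sarc = (x_sarc[:, 0] > 0).long()
final_loss = fine_tune(encoder, nn.Linear(DIM, 2), x_sarc, y_sarc)
print(f"target-task loss after transfer: {final_loss:.3f}")
```

In the paper's actual setting, `Encoder` would be a pre-trained BERT model and the two stages would use real sentiment/emotion and sarcasm datasets; the key point the sketch captures is that the encoder's weights carry over between stages while each task gets its own head.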