To Augment or Not to Augment? A Comparative Study on Text Augmentation Techniques for Low-Resource NLP

Abstract: Data-hungry deep neural networks have established themselves as the de facto standard for many NLP tasks, including the traditional sequence tagging ones. Despite their state-of-the-art performance on high-resource languages, they still fall behind their statistical counterparts...

Bibliographic Details
Main Author: Gözde Gül Şahin
Format: Article
Language: English
Published: The MIT Press 2022-01-01
Series: Computational Linguistics
Online Access: https://direct.mit.edu/coli/article/48/1/5/108844/To-Augment-or-Not-to-Augment-A-Comparative-Study