Temporal Effects on Pre-trained Models for Language Processing Tasks

Abstract

Keeping the performance of language technologies optimal as time passes is of great practical interest. We study temporal effects on model performance on downstream language tasks, establishing a nuanced terminology for such discussion and identifying factors essential to con...


Bibliographic Details
Main Authors: Oshin Agarwal, Ani Nenkova
Format: Article
Language: English
Published: The MIT Press, 2022-01-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00497/112912/Temporal-Effects-on-Pre-trained-Models-for