Temporal adaptation of BERT and performance on downstream document classification: insights from social media

Language use differs between domains, and even within a domain it changes over time. For pre-trained language models like BERT, domain adaptation through continued pre-training has been shown to improve performance on in-domain downstream tasks. In this article, we investigate whether temporal adaptation...
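The adaptation mechanism mentioned in the abstract is continued pre-training of BERT on unlabelled in-domain (or time-specific) text before fine-tuning on the downstream task. As a rough illustration only, the sketch below shows what continued pre-training with the masked language modelling objective can look like using the Hugging Face Transformers and Datasets libraries; the corpus file, checkpoint name, and hyperparameters are illustrative placeholders and are not taken from the paper.

```python
# Minimal sketch: continued pre-training (domain/temporal adaptation) of BERT
# via masked language modelling. File names and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Unlabelled text from the target domain and time period (path is hypothetical).
dataset = load_dataset("text", data_files={"train": "tweets_2020.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard 15% random token masking for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-temporal-adapted",
    num_train_epochs=1,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()

# The adapted checkpoint in "bert-temporal-adapted" would then be fine-tuned
# on the downstream document classification task in the usual way.
```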

Detailed description

Bibliographic details
Main authors: Röttger, P, Pierrehumbert, JB
Format: Conference item
Language: English
Published: Association for Computational Linguistics, 2021