Temporal adaptation of BERT and performance on downstream document classification: insights from social media
Language use differs between domains, and even within a domain, language use changes over time. For pre-trained language models like BERT, domain adaptation through continued pre-training has been shown to improve performance on in-domain downstream tasks. In this article, we investigate whether temporal adaptation can bring additional benefits. For this purpose, we introduce a corpus of social media comments sampled over three years. It contains unlabelled data for adaptation and evaluation on an upstream masked language modelling task, as well as labelled data for fine-tuning and evaluation on a downstream document classification task. We find that temporality matters for both tasks: temporal adaptation improves upstream task performance, and temporal fine-tuning improves downstream task performance. Time-specific models generally perform better on past than on future test sets, which matches evidence on the bursty usage of topical words. However, adapting BERT to time and domain does not improve performance on the downstream task over adapting to domain alone. Token-level analysis shows that temporal adaptation captures event-driven changes in language use in the downstream task, but not the changes that are actually relevant to task performance. Based on our findings, we discuss when temporal adaptation may be more effective.
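The evaluation design described in the abstract (time-specific models tested on past, same-period, and future test sets drawn from a corpus sampled over three years) can be sketched as a time-stratified split. This is a minimal illustrative sketch, not code from the paper: the function name, the `(year, text)` tuple format, and the past/same/future labels are assumptions introduced here to make the setup concrete.

```python
from collections import defaultdict


def time_stratified_splits(comments, train_year):
    """Pair a training period with a test set from every sampled period.

    `comments` is a list of (year, text) tuples standing in for the
    timestamped social media corpus. For each test year, the split is
    labelled "past", "same", or "future" relative to `train_year`, so
    that a model trained on one period can be evaluated across time.
    """
    by_year = defaultdict(list)
    for year, text in comments:
        by_year[year].append(text)

    pairs = []
    for test_year in sorted(by_year):
        if test_year < train_year:
            relation = "past"
        elif test_year == train_year:
            relation = "same"
        else:
            relation = "future"
        pairs.append((train_year, test_year, relation, by_year[test_year]))
    return pairs


# Hypothetical three-year corpus; a model adapted to 2019 data is
# evaluated on the 2018 (past), 2019 (same), and 2020 (future) splits.
corpus = [(2018, "comment a"), (2019, "comment b"),
          (2020, "comment c"), (2019, "comment d")]
splits = time_stratified_splits(corpus, train_year=2019)
```

The finding that time-specific models perform better on past than on future test sets corresponds to comparing scores on the "past" versus "future" splits produced here.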
Main Authors: Röttger, P; Pierrehumbert, JB
Format: Conference item
Language: English
Published: Association for Computational Linguistics, 2021
id | oxford-uuid:c22a5d91-639c-4a26-903b-670ae077e3af |
institution | University of Oxford |