Lightweight transformers for clinical natural language processing

Bibliographic Details
Main Authors: Rohanian, O, Nouriborji, M, Jauncey, H, Kouchaki, S, Nooralahzadeh, F, Clifton, L, Merson, L, Clifton, DA
Other Authors: ISARIC Clinical Characterisation Group
Format: Journal article
Language: English
Published: Cambridge University Press 2024
author Rohanian, O
Nouriborji, M
Jauncey, H
Kouchaki, S
Nooralahzadeh, F
Clifton, L
Merson, L
Clifton, DA
author2 ISARIC Clinical Characterisation Group
collection OXFORD
description Specialised pre-trained language models are becoming more frequent in Natural Language Processing (NLP) since they can potentially outperform models trained on generic texts. BioBERT (Lee et al., BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics, 36(4), pp. 1234–1240, 2020) and BioClinicalBERT (Alsentzer et al., Publicly available clinical BERT embeddings. In Proceedings of the 2nd Clinical Natural Language Processing Workshop, pp. 72–78, 2019) are two examples of such models that have shown promise in medical NLP tasks. Many of these models are overparametrised and resource-intensive, but thanks to techniques like knowledge distillation, it is possible to create smaller versions that perform almost as well as their larger counterparts. In this work, we specifically focus on the development of compact language models for processing clinical texts (e.g. progress notes and discharge summaries). We developed a number of efficient lightweight clinical transformers using knowledge distillation and continual learning, with the number of parameters ranging from 15 million to 65 million. These models performed comparably to larger models such as BioBERT and BioClinicalBERT and significantly outperformed other compact models trained on general or biomedical data. Our extensive evaluation covered several standard datasets and a wide range of clinical text-mining tasks, including natural language inference, relation extraction, named entity recognition and sequence classification. To our knowledge, this is the first comprehensive study specifically focused on creating efficient and compact transformers for clinical NLP tasks. The models and code used in this study are available on our Hugging Face profile at https://huggingface.co/nlpie and our GitHub page at https://github.com/nlpie-research/Lightweight-Clinical-Transformers, respectively, promoting reproducibility of our results.
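The abstract names knowledge distillation as the core compression technique. As a generic illustration (a minimal sketch of the standard Hinton-style objective, not necessarily the exact loss used in the paper), a student's logits can be trained against a teacher's temperature-scaled logits alongside the ordinary hard-label loss; the temperature T and mixing weight alpha below are illustrative hyperparameters:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the gradient magnitude is independent of T
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

The released checkpoints load with the standard Hugging Face transformers API; the checkpoint name below is an assumption for illustration (see https://huggingface.co/nlpie for the actual list of released models):

from transformers import AutoModel, AutoTokenizer

model_name = "nlpie/distil-clinicalbert"  # assumed name; check the nlpie profile
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

note = "Patient admitted with shortness of breath; discharged on day 3."
inputs = tokenizer(note, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)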
format Journal article
id oxford-uuid:86fcc94c-4479-49be-84c4-04e16ce0591f
institution University of Oxford
language English
publishDate 2024
publisher Cambridge University Press
title Lightweight transformers for clinical natural language processing