Lightweight transformers for clinical natural language processing
Specialised pre-trained language models are becoming more frequent in Natural Language Processing (NLP), since they can potentially outperform models trained on generic texts. BioBERT (Sanh et al., DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint...
Main Authors:
Other Authors:
Format: Journal article
Language: English
Published: Cambridge University Press, 2024