On the effectiveness of compact biomedical transformers
**Motivation:** Language models pre-trained on biomedical corpora, such as BioBERT, have recently shown promising results on downstream biomedical tasks. Many existing pre-trained models, on the other hand, are resource-intensive and computationally heavy o…
| Main Authors | , , , |
| --- | --- |
| Format | Journal article |
| Language | English |
| Published | Oxford University Press, 2023 |