Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction
Abstract: Deep learning (DL)-based predictive models built from electronic health records (EHRs) deliver impressive performance on many clinical tasks. However, these models often require large training cohorts to achieve high accuracy, hindering the adoption of DL-based models in scenarios with l...
| Main Authors: | Laila Rasmy, Yang Xiang, Ziqian Xie, Cui Tao, Degui Zhi |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2021-05-01 |
| Series: | npj Digital Medicine |
| Online Access: | https://doi.org/10.1038/s41746-021-00455-y |
Similar Items
- The classification of PubMed neurosurgical abstracts using pretrained BERT model
  by: G. Danilov, et al.
  Published: (2021-01-01)
- Pretrained Transformers for Text Ranking: BERT and Beyond
  by: Suzan Verberne
  Published: (2023-03-01)
- SAPBERT: Speaker-Aware Pretrained BERT for Emotion Recognition in Conversation
  by: Seunguook Lim, et al.
  Published: (2022-12-01)
- Time-sensitive clinical concept embeddings learned from large electronic health records
  by: Yang Xiang, et al.
  Published: (2019-04-01)
- Deep learning model for personalized prediction of positive MRSA culture using time-series electronic health records
  by: Masayuki Nigo, et al.
  Published: (2024-03-01)