Author Correction: A pre-trained BERT for Korean medical natural language processing
| Main Authors: | Yoojoong Kim, Jong-Ho Kim, Jeong Moon Lee, Moon Joung Jang, Yun Jin Yum, Seongtae Kim, Unsub Shin, Young-Min Kim, Hyung Joon Joo, Sanghoun Song |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2023-06-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-023-36519-0 |
Similar Items
- A pre-trained BERT for Korean medical natural language processing
  by: Yoojoong Kim, et al.
  Published: (2022-08-01)
- A Word Pair Dataset for Semantic Similarity and Relatedness in Korean Medical Vocabulary: Reference Development and Validation
  by: Yunjin Yum, et al.
  Published: (2021-06-01)
- Investigating a neural language model’s replicability of psycholinguistic experiments: A case study of NPI licensing
  by: Unsub Shin, et al.
  Published: (2023-02-01)
- Standardized Database of 12-Lead Electrocardiograms with a Common Standard for the Promotion of Cardiovascular Research: KURIAS-ECG
  by: Hakje Yoo, et al.
  Published: (2023-04-01)
- Zero‐anaphora resolution in Korean based on deep language representation model: BERT
  by: Youngtae Kim, et al.
  Published: (2020-10-01)