Application of Entity-BERT model based on neuroscience and brain-like cognition in electronic medical record entity recognition


Bibliographic Details
Main Authors: Weijia Lu, Jiehui Jiang, Yaxiang Shi, Xiaowei Zhong, Jun Gu, Lixia Huangfu, Ming Gong
Format: Article
Language: English
Published: Frontiers Media S.A. 2023-09-01
Series: Frontiers in Neuroscience
Subjects: BERT; LSTM; cross attention; entity recognition; electronic medical records
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2023.1259652/full
author Weijia Lu
Jiehui Jiang
Yaxiang Shi
Xiaowei Zhong
Jun Gu
Lixia Huangfu
Ming Gong
collection DOAJ
description Introduction: In the medical field, electronic medical records contain a large amount of textual information, and the unstructured nature of this information makes data extraction and analysis challenging. Automatic extraction of entity information from electronic medical records has therefore become a significant issue in the healthcare domain. Methods: To address this problem, this paper proposes a deep learning-based entity information extraction model called Entity-BERT. The model leverages the feature extraction capabilities of deep learning and the pre-trained language representations of BERT (Bidirectional Encoder Representations from Transformers) to automatically learn and recognize various entity types in electronic medical records, including medical terminology, disease names, and drug information, providing more effective support for medical research and clinical practice. Entity-BERT uses a multi-layer neural network and a cross-attention mechanism to process and fuse information at different levels and of different types, resembling the hierarchical, distributed processing of the human brain. It employs pre-trained language and sequence models to process and learn from textual data, paralleling the brain's language processing and semantic understanding, and it captures contextual information and long-term dependencies, using cross-attention to handle the complex and diverse language expressions found in electronic medical records. The paper also explores, from the perspective of neuroscience and brain-like cognition, how competitive learning, adaptive regulation, and synaptic plasticity can be used to optimize the model's predictions, automatically adjust its parameters, and achieve adaptive learning and dynamic adjustment. Results and discussion: Experimental results demonstrate that Entity-BERT achieves outstanding performance on entity recognition tasks in electronic medical records, surpassing existing entity recognition models. This research not only provides more efficient and accurate natural language processing technology for the medical and health field but also introduces new ideas and directions for the design and optimization of deep learning models.
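As a concrete illustration of the architecture outlined in the abstract (the record's subject keywords list BERT, LSTM, and cross attention), the following is a minimal sketch of a BERT + BiLSTM + cross-attention token tagger for electronic medical record entity recognition. The class name EntityBERTSketch, the bert-base-cased checkpoint, the label count, layer sizes, and the exact fusion scheme are illustrative assumptions, not the authors' implementation; consult the full text at the DOI for the actual Entity-BERT design.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EntityBERTSketch(nn.Module):
    """Illustrative BERT + BiLSTM + cross-attention tagger (assumed design, not the paper's code)."""
    def __init__(self, pretrained="bert-base-cased", num_labels=9, lstm_hidden=256):
        super().__init__()
        self.bert = AutoModel.from_pretrained(pretrained)       # pre-trained contextual encoder
        d_bert = self.bert.config.hidden_size                   # 768 for bert-base
        self.lstm = nn.LSTM(d_bert, lstm_hidden, batch_first=True,
                            bidirectional=True)                 # long-range sequence modeling
        d_seq = 2 * lstm_hidden                                 # BiLSTM output width
        self.proj = nn.Linear(d_bert, d_seq)                    # align BERT width with BiLSTM width
        self.cross_attn = nn.MultiheadAttention(d_seq, num_heads=4,
                                                batch_first=True)  # fuse the two representations
        self.classifier = nn.Linear(d_seq, num_labels)          # per-token label scores (e.g., BIO tags)

    def forward(self, input_ids, attention_mask):
        bert_states = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_states, _ = self.lstm(bert_states)
        # Cross-attention: BiLSTM states query the projected BERT states.
        fused, _ = self.cross_attn(lstm_states,
                                   self.proj(bert_states),
                                   self.proj(bert_states),
                                   key_padding_mask=~attention_mask.bool())
        return self.classifier(fused)                           # (batch, seq_len, num_labels)

# Toy usage: tag one EMR-style sentence; in practice the logits would be decoded
# with argmax or a CRF layer and mapped to an entity tag set.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
batch = tokenizer(["The patient was given aspirin 100 mg for coronary artery disease."],
                  return_tensors="pt")
model = EntityBERTSketch()
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.argmax(dim=-1))  # predicted label index per token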
format Article
id doaj.art-52ce4f0a570442169e1d6cecc4859d47
institution Directory Open Access Journal
issn 1662-453X
language English
publishDate 2023-09-01
publisher Frontiers Media S.A.
series Frontiers in Neuroscience
doi 10.3389/fnins.2023.1259652
volume 17 (2023)
affiliations Weijia Lu: Science and Technology Department, Affiliated Hospital of Nantong University, Nantong, China; Jianghai Hospital of Nantong Sutong Science and Technology Park, Nantong, China. Jiehui Jiang: Department of Biomedical Engineering, Shanghai University, Shanghai, China. Yaxiang Shi: Network Information Center, Zhongda Hospital Southeast University, Nanjing, China. Xiaowei Zhong: School of Information and Control Engineering, China University of Mining and Technology, Xuzhou, China. Jun Gu: Department of Respiratory, Affiliated Hospital of Nantong University, Nantong, China. Lixia Huangfu: Information Center Department, Affiliated Hospital of Nantong University, Nantong, China. Ming Gong: Information Center Department, Affiliated Hospital of Nantong University, Nantong, China.
title Application of Entity-BERT model based on neuroscience and brain-like cognition in electronic medical record entity recognition
topic BERT
LSTM
cross attention
entity recognition
electronic medical records
url https://www.frontiersin.org/articles/10.3389/fnins.2023.1259652/full