A Joint Learning Model to Extract Entities and Relations for Chinese Literature Based on Self-Attention
Extracting structured information from massive, heterogeneous text is an active research topic in natural language processing. It involves two key technologies: named entity recognition (NER) and relation extraction (RE). However, previous NER models pay little attention to the influence of...
Main Authors: | Li-Xin Liang, Lin Lin, E Lin, Wu-Shao Wen, Guo-Yan Huang |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Mathematics |
Subjects: | Chinese named entity recognition; relation extraction; joint learning; long short-term memory neural network; self-attention mechanism |
Online Access: | https://www.mdpi.com/2227-7390/10/13/2216 |
---|---|
author | Li-Xin Liang; Lin Lin; E Lin; Wu-Shao Wen; Guo-Yan Huang |
collection | DOAJ |
description | Extracting structured information from massive, heterogeneous text is an active research topic in natural language processing. It involves two key technologies: named entity recognition (NER) and relation extraction (RE). However, previous NER models pay little attention to the influence of mutual attention between words in a text on the prediction of entity labels, and there is little research on how to more fully exploit sentence information for relation classification. In addition, previous work treats NER and RE as a pipeline of two separate tasks, which neglects the connection between them, and focuses mainly on English corpora. In this paper, building on the self-attention mechanism, the bidirectional long short-term memory (BiLSTM) network, and the conditional random field (CRF) model, we propose a Chinese NER method based on BiLSTM-Self-Attention-CRF and an RE method based on BiLSTM-Multilevel-Attention for the domain of Chinese literature. In particular, because the two tasks share word-vector and context-feature representations in the neural network, we propose a joint learning method that builds both tasks on the same underlying module and jointly updates the parameters of this shared module during training. For evaluation, we use the largest Chinese data set covering both tasks. Experimental results show that the independently trained NER and RE models outperform all previous methods, and that the jointly trained NER-RE model outperforms the independently trained NER and RE models. |
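As a rough illustration of the joint setup described in the abstract, the sketch below shows a shared embedding + BiLSTM encoder whose parameters receive gradients from both an NER head (self-attention followed by per-token tag scores; the CRF decoding layer used in the paper is omitted for brevity) and an RE head (attention-pooled sentence vector followed by relation classification, a simplification of the multilevel attention described in the paper). All names, dimensions, and the toy training step are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of joint NER + RE training with a shared encoder (assumed setup).
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Embedding + BiLSTM shared by the NER and RE tasks."""
    def __init__(self, vocab_size=5000, emb_dim=128, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        out, _ = self.bilstm(self.emb(token_ids))    # (B, T, 2*hidden)
        return out

class NERHead(nn.Module):
    """Self-attention over the shared states, then per-token tag scores.
    (A CRF layer would normally decode these scores; omitted here.)"""
    def __init__(self, dim=256, num_tags=13, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.tagger = nn.Linear(dim, num_tags)

    def forward(self, h):
        a, _ = self.attn(h, h, h)
        return self.tagger(a)                        # (B, T, num_tags)

class REHead(nn.Module):
    """Word-level attention pooling over the shared states, then relation logits."""
    def __init__(self, dim=256, num_relations=9):
        super().__init__()
        self.score = nn.Linear(dim, 1)
        self.cls = nn.Linear(dim, num_relations)

    def forward(self, h):
        w = torch.softmax(self.score(h), dim=1)      # (B, T, 1) attention weights
        sent = (w * h).sum(dim=1)                    # weighted sentence vector
        return self.cls(sent)                        # (B, num_relations)

# Joint training step: both task losses back-propagate into the shared encoder.
encoder, ner, re_head = SharedEncoder(), NERHead(), REHead()
params = list(encoder.parameters()) + list(ner.parameters()) + list(re_head.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

tokens = torch.randint(1, 5000, (2, 20))             # toy batch: 2 sentences, 20 tokens
ner_gold = torch.randint(0, 13, (2, 20))              # toy per-token entity tags
rel_gold = torch.randint(0, 9, (2,))                  # toy sentence-level relation labels

h = encoder(tokens)
loss = nn.functional.cross_entropy(ner(h).reshape(-1, 13), ner_gold.reshape(-1)) \
     + nn.functional.cross_entropy(re_head(h), rel_gold)
opt.zero_grad(); loss.backward(); opt.step()
```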
format | Article |
id | doaj.art-82d1d96136cb4c07aec742ed47f0de5b |
institution | Directory Open Access Journal |
issn | 2227-7390 |
language | English |
publishDate | 2022-06-01 |
publisher | MDPI AG |
series | Mathematics |
citation | Mathematics, vol. 10, no. 13, article 2216, published 2022-06-01 by MDPI AG; DOI: 10.3390/math10132216. Affiliations: Li-Xin Liang and Lin Lin, College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China; E Lin, Wu-Shao Wen, and Guo-Yan Huang, School of Computer Science and Engineering, Sun Yat-Sen University, Guangzhou 510006, China. |
title | A Joint Learning Model to Extract Entities and Relations for Chinese Literature Based on Self-Attention |
topic | Chinese named entity recognition; relation extraction; joint learning; long short-term memory neural network; self-attention mechanism |
url | https://www.mdpi.com/2227-7390/10/13/2216 |