Incorporating Entity Type-Aware and Word–Word Relation-Aware Attention in Generative Named Entity Recognition
Named entity recognition (NER) is a critical subtask in natural language processing. It is particularly valuable to gain a deeper understanding of entity boundaries and entity types when addressing the NER problem. Most previous sequential labeling models are task-specific, while recent years have witnessed the rise of generative models due to the advantage of tackling NER tasks in the encoder–decoder framework. Despite achieving promising performance, our pilot studies demonstrate that existing generative models are ineffective at detecting entity boundaries and estimating entity types. In this paper, a multiple attention framework is proposed which introduces the attention of entity-type embedding and word–word relation into the named entity recognition task. To improve the accuracy of entity-type mapping, we adopt an external knowledge base to calculate the prior entity-type distributions and then incorporate the information input to the model via the encoder’s self-attention. To enhance the contextual information, we take the entity types as part of the input. Our method obtains the other attention from the hidden states of entity types and utilizes it in self- and cross-attention mechanisms in the decoder. We transform the entity boundary information in the sequence into word–word relations and extract the corresponding embedding into the cross-attention mechanism. Through word–word relation information, the method can learn and understand more entity boundary information, thereby improving its entity recognition accuracy. We performed experiments on extensive NER benchmarks, including four flat and two long entity benchmarks. Our approach significantly improves or performs similarly to the best generative NER models. The experimental results demonstrate that our method can substantially enhance the capabilities of generative NER models.
Main Authors: | Ying Mo, Zhoujun Li |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-04-01 |
Series: | Electronics |
Subjects: | named entity recognition; attention; generative model |
Online Access: | https://www.mdpi.com/2079-9292/13/7/1407 |
ISSN: | 2079-9292 |
DOI: | 10.3390/electronics13071407 |
Volume/Issue: | 13 (7), Article 1407 |
Author Affiliations: | State Key Lab of Software Development Environment, Beihang University, Beijing 100191, China |
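The abstract describes two attention-level injections: a knowledge-base-derived prior over entity types biasing the encoder's self-attention, and boundary-derived word–word relation embeddings biasing the decoder's cross-attention. The following is a minimal PyTorch sketch of that general idea only, not the authors' implementation: it collapses both biases into a single single-head attention module, and every name, shape, and design choice (`BiasedAttention`, scalar bias projections, the relation-label vocabulary) is an illustrative assumption.

```python
# Minimal sketch (not the paper's code): single-head attention with two
# additive score biases, illustrating how (1) a gazetteer/KB-derived
# entity-type prior and (2) a word-word relation label could be injected
# into attention. All tensor names and shapes are assumptions.
import math
import torch
import torch.nn as nn

class BiasedAttention(nn.Module):
    def __init__(self, d_model: int, n_types: int, n_relations: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Projects each token's prior type distribution (e.g., from a
        # knowledge-base lookup) into a scalar attention bias.
        self.type_proj = nn.Linear(n_types, 1)
        # One learned scalar bias per word-word relation label, e.g.,
        # "inside the same entity span" vs. "no relation".
        self.rel_bias = nn.Embedding(n_relations, 1)
        self.scale = math.sqrt(d_model)

    def forward(self, x, type_prior, relation_ids):
        # x:            (batch, seq, d_model)   token hidden states
        # type_prior:   (batch, seq, n_types)   KB-derived type distribution
        # relation_ids: (batch, seq, seq)       word-word relation labels
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / self.scale            # (b, s, s)
        # Bias attention toward tokens the KB believes are entity-typed;
        # the (b, 1, s) bias broadcasts over the query dimension.
        scores = scores + self.type_proj(type_prior).transpose(-2, -1)
        # Bias each query-key pair by its boundary-derived relation label.
        scores = scores + self.rel_bias(relation_ids).squeeze(-1)
        return torch.softmax(scores, dim=-1) @ v

# Toy usage with random inputs.
if __name__ == "__main__":
    attn = BiasedAttention(d_model=64, n_types=5, n_relations=3)
    x = torch.randn(2, 10, 64)
    prior = torch.softmax(torch.randn(2, 10, 5), dim=-1)
    rels = torch.randint(0, 3, (2, 10, 10))
    out = attn(x, prior, rels)
    print(out.shape)  # torch.Size([2, 10, 64])
```

Per the abstract, the paper places these two signals in different sub-layers (type prior in the encoder's self-attention, word–word relation embedding in the decoder's cross-attention); the sketch merges them into one module purely for brevity.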