Attention-Based LSTM with Filter Mechanism for Entity Relation Classification
Relation classification is an important research area in the field of natural language processing (NLP), which aims to recognize the relationship between two tagged entities in a sentence. The noise caused by irrelevant words and the word distance between the tagged entities may affect the relation classification accuracy.
Main Authors: | Yanliang Jin, Dijia Wu, Weisi Guo |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-10-01 |
Series: | Symmetry |
Subjects: | relation classification; attention mechanism; bidirectional LSTM network; natural language processing |
Online Access: | https://www.mdpi.com/2073-8994/12/10/1729 |
_version_ | 1797550555609956352 |
---|---|
author | Yanliang Jin; Dijia Wu; Weisi Guo |
author_facet | Yanliang Jin; Dijia Wu; Weisi Guo |
author_sort | Yanliang Jin |
collection | DOAJ |
description | Relation classification is an important research area in the field of natural language processing (NLP), which aims to recognize the relationship between two tagged entities in a sentence. The noise caused by irrelevant words and the word distance between the tagged entities may affect the relation classification accuracy. In this paper, we present a novel model, the multi-head attention long short-term memory (LSTM) network with filter mechanism (MALNet), to extract text features and classify the relation of two entities in a sentence. In particular, we combine LSTM with an attention mechanism to obtain shallow local information, and we introduce a filter layer based on the attention mechanism to strengthen the available information. In addition, we design a semantic rule for marking the keyword between the target words and construct a keyword layer to extract its semantic information. We evaluated the performance of our model on the SemEval-2010 Task 8 dataset and the KBP-37 dataset, achieving F1-scores of 86.3% and 61.4%, respectively, which shows that our method outperforms previous state-of-the-art methods. |
first_indexed | 2024-03-10T15:31:01Z |
format | Article |
id | doaj.art-e315fae38e2d4daa86bd8a9dfc46d3a5 |
institution | Directory Open Access Journal |
issn | 2073-8994 |
language | English |
last_indexed | 2024-03-10T15:31:01Z |
publishDate | 2020-10-01 |
publisher | MDPI AG |
record_format | Article |
series | Symmetry |
spelling | doaj.art-e315fae38e2d4daa86bd8a9dfc46d3a5; indexed 2023-11-20T17:42:01Z; eng; MDPI AG; Symmetry; ISSN 2073-8994; 2020-10-01; vol. 12, no. 10, art. 1729; DOI 10.3390/sym12101729; Attention-Based LSTM with Filter Mechanism for Entity Relation Classification; Yanliang Jin (Associate Professor, School of Communication and Information Engineering (SCIE), Shanghai University (SHU), Shanghai 200000, China); Dijia Wu (School of Communication and Information Engineering (SCIE), Shanghai University (SHU), Shanghai 200000, China); Weisi Guo (School of Engineering, University of Warwick, Coventry CV4 7AL, UK); https://www.mdpi.com/2073-8994/12/10/1729; relation classification; attention mechanism; bidirectional LSTM network; natural language processing |
spellingShingle | Yanliang Jin Dijia Wu Weisi Guo Attention-Based LSTM with Filter Mechanism for Entity Relation Classification Symmetry relation classification attention mechanism bidirectional LSTM network natural language processing |
title | Attention-Based LSTM with Filter Mechanism for Entity Relation Classification |
title_full | Attention-Based LSTM with Filter Mechanism for Entity Relation Classification |
title_fullStr | Attention-Based LSTM with Filter Mechanism for Entity Relation Classification |
title_full_unstemmed | Attention-Based LSTM with Filter Mechanism for Entity Relation Classification |
title_short | Attention-Based LSTM with Filter Mechanism for Entity Relation Classification |
title_sort | attention based lstm with filter mechanism for entity relation classification |
topic | relation classification; attention mechanism; bidirectional LSTM network; natural language processing |
url | https://www.mdpi.com/2073-8994/12/10/1729 |
work_keys_str_mv | AT yanliangjin attentionbasedlstmwithfiltermechanismforentityrelationclassification AT dijiawu attentionbasedlstmwithfiltermechanismforentityrelationclassification AT weisiguo attentionbasedlstmwithfiltermechanismforentityrelationclassification |
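The word-level attention described in the abstract (scoring each BiLSTM hidden state against a learned query, normalizing with softmax, and taking a weighted sum as the sentence representation) can be sketched as below. This is a minimal, generic NumPy illustration with made-up dimensions and random stand-in values, not the authors' MALNet implementation; the filter and keyword layers are omitted.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w):
    """Attention pooling: score each time step's hidden state H[t]
    against a learned query vector w, normalize the scores with
    softmax, and return the weighted sum as a fixed-size vector."""
    scores = H @ w           # (T,) one score per time step
    alpha = softmax(scores)  # (T,) attention weights, sum to 1
    return alpha @ H, alpha  # (d,) sentence vector, (T,) weights

rng = np.random.default_rng(0)
T, d = 6, 8                      # 6 time steps, hidden size 8 (arbitrary)
H = rng.standard_normal((T, d))  # stand-in for BiLSTM hidden states
w = rng.standard_normal(d)       # stand-in for the learned attention query
sent_vec, alpha = attention_pool(H, w)
```

In the paper's setting, `sent_vec` would feed a classifier over the relation labels; the filter layer the abstract mentions would further reweight or suppress uninformative time steps before pooling.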