FA-RCNet: A Fused Feature Attention Network for Relationship Classification
Relation extraction is an important task in natural language processing. It plays an integral role in intelligent question-answering systems, semantic search, and knowledge graph construction. For this task, previous studies have demonstrated the effectiveness of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory networks (LSTMs) in relation classification. Recently, owing to its superior performance, the pre-trained model BERT has become the feature extraction module of many relation classification models, and BERT-based work has achieved good results. However, most such work uses only the deepest-level features, ignoring the important role that shallow-level information plays in relation classification. To address this problem, this paper proposes FA-RCNet (fusion-attention relationship classification network), a relationship classification network with feature fusion and an attention mechanism. FA-RCNet fuses shallow-level features with deep-level features and augments entity features and global features through an attention module, so that the resulting feature vector supports relation classification more effectively. Compared with previously published models, the proposed model achieves superior results on both the SemEval-2010 Task 8 and KBP37 datasets.
Main Authors: | Jiakai Tian, Gang Li, Mingle Zhou, Min Li, Delong Han |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-12-01 |
Series: | Applied Sciences |
Subjects: | relationship classification; attentional mechanisms; feature fusion |
Online Access: | https://www.mdpi.com/2076-3417/12/23/12460 |
_version_ | 1797463571851902976 |
---|---|
author | Jiakai Tian; Gang Li; Mingle Zhou; Min Li; Delong Han |
author_facet | Jiakai Tian; Gang Li; Mingle Zhou; Min Li; Delong Han |
author_sort | Jiakai Tian |
collection | DOAJ |
description | Relation extraction is an important task in natural language processing. It plays an integral role in intelligent question-answering systems, semantic search, and knowledge graph construction. For this task, previous studies have demonstrated the effectiveness of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory networks (LSTMs) in relation classification. Recently, owing to its superior performance, the pre-trained model BERT has become the feature extraction module of many relation classification models, and BERT-based work has achieved good results. However, most such work uses only the deepest-level features, ignoring the important role that shallow-level information plays in relation classification. To address this problem, this paper proposes FA-RCNet (fusion-attention relationship classification network), a relationship classification network with feature fusion and an attention mechanism. FA-RCNet fuses shallow-level features with deep-level features and augments entity features and global features through an attention module, so that the resulting feature vector supports relation classification more effectively. Compared with previously published models, the proposed model achieves superior results on both the SemEval-2010 Task 8 and KBP37 datasets. |
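The abstract's core idea, combining a shallow and a deep encoder layer and then pooling with attention, can be illustrated with a minimal numpy sketch. This is a hypothetical illustration only: the record does not specify which layers FA-RCNet fuses, how the fusion is computed, or the exact form of its attention module, so the element-wise sum, the mean-query additive attention, and the layer indices below are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_and_attend(layer_outputs, shallow=1, deep=-1):
    """layer_outputs: (num_layers, seq_len, dim) encoder hidden states.

    Fuse one shallow and one deep layer (element-wise sum, an assumption),
    then pool over tokens with a simple attention weighting.
    """
    fused = layer_outputs[shallow] + layer_outputs[deep]   # (seq_len, dim)
    # Score each token against the mean fused vector, normalize to weights.
    scores = softmax(fused @ fused.mean(axis=0))           # (seq_len,)
    return scores @ fused                                  # (dim,) pooled feature

# Toy stand-in for BERT-base hidden states: embeddings + 12 layers,
# 8 tokens, 16-dimensional features (real BERT uses 768).
rng = np.random.default_rng(0)
states = rng.normal(size=(13, 8, 16))
vec = fuse_and_attend(states)
print(vec.shape)  # (16,)
```

In a real pipeline, the pooled vector (or per-entity variants of it) would feed a classification head over the relation labels; here the point is only that shallow-layer information enters the final feature instead of being discarded.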
first_indexed | 2024-03-09T17:52:39Z |
format | Article |
id | doaj.art-bbf7059cf83f4238b2dc63989777a543 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
last_indexed | 2024-03-09T17:52:39Z |
publishDate | 2022-12-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj.art-bbf7059cf83f4238b2dc63989777a543 2023-11-24T10:36:52Z eng MDPI AG Applied Sciences 2076-3417 2022-12-01 vol. 12, iss. 23, art. 12460 doi:10.3390/app122312460 FA-RCNet: A Fused Feature Attention Network for Relationship Classification Jiakai Tian; Gang Li; Mingle Zhou; Min Li; Delong Han (all: Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan 250316, China) [abstract as in the description field] https://www.mdpi.com/2076-3417/12/23/12460 relationship classification; attentional mechanisms; feature fusion |
spellingShingle | Jiakai Tian Gang Li Mingle Zhou Min Li Delong Han FA-RCNet: A Fused Feature Attention Network for Relationship Classification Applied Sciences relationship classification attentional mechanisms feature fusion |
title | FA-RCNet: A Fused Feature Attention Network for Relationship Classification |
title_full | FA-RCNet: A Fused Feature Attention Network for Relationship Classification |
title_fullStr | FA-RCNet: A Fused Feature Attention Network for Relationship Classification |
title_full_unstemmed | FA-RCNet: A Fused Feature Attention Network for Relationship Classification |
title_short | FA-RCNet: A Fused Feature Attention Network for Relationship Classification |
title_sort | fa rcnet a fused feature attention network for relationship classification |
topic | relationship classification attentional mechanisms feature fusion |
url | https://www.mdpi.com/2076-3417/12/23/12460 |
work_keys_str_mv | AT jiakaitian farcnetafusedfeatureattentionnetworkforrelationshipclassification AT gangli farcnetafusedfeatureattentionnetworkforrelationshipclassification AT minglezhou farcnetafusedfeatureattentionnetworkforrelationshipclassification AT minli farcnetafusedfeatureattentionnetworkforrelationshipclassification AT delonghan farcnetafusedfeatureattentionnetworkforrelationshipclassification |