Relational prompt-based single-module single-step model for relational triple extraction

Bibliographic Details
Main Authors: Zhi Zhang, Hui Liu, Junan Yang, Xiaoshuai Li
Format: Article
Language: English
Published: Elsevier 2023-10-01
Series: Journal of King Saud University: Computer and Information Sciences
Subjects: Natural language processing; Information extraction; Relational triple extraction; Prompt learning; Attention mechanism; Relational semantics
Online Access: http://www.sciencedirect.com/science/article/pii/S1319157823003026
_version_ 1797629713150115840
author Zhi Zhang
Hui Liu
Junan Yang
Xiaoshuai Li
author_facet Zhi Zhang
Hui Liu
Junan Yang
Xiaoshuai Li
author_sort Zhi Zhang
collection DOAJ
description Relational triple extraction is a fundamental and essential information extraction task. Existing approaches to relational triple extraction achieve considerable performance but still suffer from 1) treating the relation between entities as a meaningless label while ignoring the semantic information of the relation itself, and 2) ignoring the interdependence and inseparability of the three elements of a triple. To address these problems, this paper proposes a Relational Prompt approach and, based on it, constructs a Single-module Single-step relational triple extraction model (RPSS). In particular, the proposed relational prompt approach consists of a relational hard-prompt and a relational soft-prompt, which take into account different levels of relational semantic information, covering both token-level and feature-level relational prompt information. We then jointly encode entities and relational prompts to obtain a unified global representation, mine deep correlations between the different embeddings through an attention mechanism, and construct a triple interaction matrix, from which all triples can be extracted directly by a single module in a single step. Experiments demonstrate the effectiveness of the relational prompt approach and show that relational semantics and triple integrity are essential for relation extraction. Experimental results on two benchmark datasets demonstrate that our model outperforms current state-of-the-art models.
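
Note: to make the abstract's single-step idea concrete, the minimal sketch below scores every (relation, subject token, object token) cell of a triple interaction tensor and thresholds it in one pass. This is an illustrative reconstruction only, not the authors' RPSS implementation; the class name TripleInteractionScorer, the bilinear scoring scheme, and all sizes and thresholds are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' RPSS code): score every
# (relation, subject-token, object-token) cell of a triple interaction
# tensor so that all triples can be decoded in a single step.
import torch
import torch.nn as nn


class TripleInteractionScorer(nn.Module):
    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # A learnable embedding per relation stands in for the
        # feature-level relational "soft prompt" described in the abstract.
        self.rel_embed = nn.Embedding(num_relations, hidden_size)
        self.subj_proj = nn.Linear(hidden_size, hidden_size)
        self.obj_proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (seq_len, hidden_size) encoder output for one sentence.
        subj = self.subj_proj(token_states)               # (n, d)
        obj = self.obj_proj(token_states)                 # (n, d)
        rel = self.rel_embed.weight                       # (m, d)
        # Bilinear-style interaction: scores[r, i, j] couples relation r,
        # subject token i, and object token j in a single tensor.
        pair = torch.einsum('id,jd->ijd', subj, obj)      # (n, n, d)
        scores = torch.einsum('rd,ijd->rij', rel, pair)   # (m, n, n)
        return torch.sigmoid(scores)


# Single-step decoding: every cell above a threshold yields one triple
# (relation id, subject token index, object token index).
scorer = TripleInteractionScorer(hidden_size=768, num_relations=24)
states = torch.randn(12, 768)             # stand-in for real encoder output
triples = (scorer(states) > 0.5).nonzero().tolist()
```

Decoding all triples from one interaction tensor is what allows a single module to emit them in a single step, in contrast to pipeline or multi-step tagging schemes.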
first_indexed 2024-03-11T10:57:57Z
format Article
id doaj.art-24c2e4e3b0a844f68c375e697eab1a6b
institution Directory Open Access Journal
issn 1319-1578
language English
last_indexed 2024-03-11T10:57:57Z
publishDate 2023-10-01
publisher Elsevier
record_format Article
series Journal of King Saud University: Computer and Information Sciences
spelling doaj.art-24c2e4e3b0a844f68c375e697eab1a6b
2023-11-13T04:08:57Z
eng
Elsevier
Journal of King Saud University: Computer and Information Sciences
1319-1578
2023-10-01, Vol. 35, No. 9, Article 101748
Relational prompt-based single-module single-step model for relational triple extraction
Zhi Zhang (College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China)
Hui Liu (College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China)
Junan Yang (Corresponding author; College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China)
Xiaoshuai Li (College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China)
Relational triple extraction is a fundamental and essential information extraction task. Existing approaches to relational triple extraction achieve considerable performance but still suffer from 1) treating the relation between entities as a meaningless label while ignoring the semantic information of the relation itself, and 2) ignoring the interdependence and inseparability of the three elements of a triple. To address these problems, this paper proposes a Relational Prompt approach and, based on it, constructs a Single-module Single-step relational triple extraction model (RPSS). In particular, the proposed relational prompt approach consists of a relational hard-prompt and a relational soft-prompt, which take into account different levels of relational semantic information, covering both token-level and feature-level relational prompt information. We then jointly encode entities and relational prompts to obtain a unified global representation, mine deep correlations between the different embeddings through an attention mechanism, and construct a triple interaction matrix, from which all triples can be extracted directly by a single module in a single step. Experiments demonstrate the effectiveness of the relational prompt approach and show that relational semantics and triple integrity are essential for relation extraction. Experimental results on two benchmark datasets demonstrate that our model outperforms current state-of-the-art models.
http://www.sciencedirect.com/science/article/pii/S1319157823003026
Natural language processing; Information extraction; Relational triple extraction; Prompt learning; Attention mechanism; Relational semantics
spellingShingle Zhi Zhang
Hui Liu
Junan Yang
Xiaoshuai Li
Relational prompt-based single-module single-step model for relational triple extraction
Journal of King Saud University: Computer and Information Sciences
Natural language processing
Information extraction
Relational triple extraction
Prompt learning
Attention mechanism
Relational semantics
title Relational prompt-based single-module single-step model for relational triple extraction
title_full Relational prompt-based single-module single-step model for relational triple extraction
title_fullStr Relational prompt-based single-module single-step model for relational triple extraction
title_full_unstemmed Relational prompt-based single-module single-step model for relational triple extraction
title_short Relational prompt-based single-module single-step model for relational triple extraction
title_sort relational prompt based single module single step model for relational triple extraction
topic Natural language processing
Information extraction
Relational triple extraction
Prompt learning
Attention mechanism
Relational semantics
url http://www.sciencedirect.com/science/article/pii/S1319157823003026
work_keys_str_mv AT zhizhang relationalpromptbasedsinglemodulesinglestepmodelforrelationaltripleextraction
AT huiliu relationalpromptbasedsinglemodulesinglestepmodelforrelationaltripleextraction
AT junanyang relationalpromptbasedsinglemodulesinglestepmodelforrelationaltripleextraction
AT xiaoshuaili relationalpromptbasedsinglemodulesinglestepmodelforrelationaltripleextraction