DG-based SPO tuple recognition using self-attention M-Bi-LSTM
This study proposes a dependency grammar-based self-attention multilayered bidirectional long short-term memory (DG-M-Bi-LSTM) model for subject–predicate–object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to the knowledge base autonomously, it is essential...
Main Author: | Joon-young Jung |
---|---|
Format: | Article |
Language: | English |
Published: | Electronics and Telecommunications Research Institute (ETRI), 2022-06-01 |
Series: | ETRI Journal |
Subjects: | dependency grammar; information extraction; long short-term memory; SPO tuple |
Online Access: | https://doi.org/10.4218/etrij.2020-0460 |
author | Joon-young Jung |
author_facet | Joon-young Jung |
author_sort | Joon-young Jung |
collection | DOAJ |
description | This study proposes a dependency grammar-based self-attention multilayered bidirectional long short-term memory (DG-M-Bi-LSTM) model for subject–predicate–object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to a knowledge base autonomously, it is essential to extract knowledge from large volumes of NL data. This study therefore proposes a high-accuracy SPO tuple recognition model that requires only a small amount of training data to extract knowledge from NL sentences. To evaluate its effectiveness, the SPO tuple recognition accuracy of DG-M-Bi-LSTM is compared with that of NL-based self-attention multilayered bidirectional LSTM, DG-based bidirectional encoder representations from transformers (BERT), and NL-based BERT. The DG-M-Bi-LSTM model achieves the best recognition accuracy for extracting SPO tuples from NL sentences even though it has fewer deep neural network (DNN) parameters than BERT. In particular, its accuracy exceeds that of BERT when training data are limited. Additionally, its pretrained DNN parameters can be transferred to other domains because the model learns the structural relations in NL sentences. (A minimal architectural sketch follows this record.) |
first_indexed | 2024-04-13T15:58:58Z |
format | Article |
id | doaj.art-b4ed3cc4a7554c38a99848b513ffab35 |
institution | Directory Open Access Journal |
issn | 1225-6463 |
language | English |
last_indexed | 2024-04-13T15:58:58Z |
publishDate | 2022-06-01 |
publisher | Electronics and Telecommunications Research Institute (ETRI) |
record_format | Article |
series | ETRI Journal |
title | DG-based SPO tuple recognition using self-attention M-Bi-LSTM |
topic | dependency grammar; information extraction; long short-term memory; SPO tuple |
url | https://doi.org/10.4218/etrij.2020-0460 |
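For readers unfamiliar with the architecture named in the title, the following is a minimal, hypothetical PyTorch sketch of a self-attention multilayered Bi-LSTM used as a per-token SPO tagger. It is a sketch of the general technique only: the class name, hyperparameters, and tag set are illustrative assumptions, not taken from the article, and the paper's dependency-grammar-based input transformation is not reproduced here.

```python
# Minimal sketch (illustrative, not the paper's exact model): a multilayered
# bidirectional LSTM with self-attention that labels each token of a
# sentence as Subject, Predicate, Object, or Other.
import torch
import torch.nn as nn

class MBiLstmSpoTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden=256,
                 layers=3, num_tags=4):  # tags: S, P, O, Other (assumed)
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Multilayered bidirectional LSTM over the token sequence.
        self.bilstm = nn.LSTM(emb_dim, hidden, num_layers=layers,
                              bidirectional=True, batch_first=True)
        # Single-head self-attention over the Bi-LSTM states; the paper
        # orders input by dependency grammar, which this sketch omits.
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=1,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * hidden, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, emb_dim)
        h, _ = self.bilstm(x)       # (batch, seq, 2*hidden)
        a, _ = self.attn(h, h, h)   # self-attention over LSTM states
        return self.classifier(a)   # per-token SPO tag logits

# Usage: tag two 12-token sentences of integer token ids.
model = MBiLstmSpoTagger(vocab_size=30000)
logits = model(torch.randint(0, 30000, (2, 12)))  # (2, 12, num_tags)
```

Training such a tagger with a standard per-token cross-entropy loss would reflect the sequence-labeling formulation implied by "SPO tuple recognition"; consult the article at the DOI above for the actual architecture and hyperparameters.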