DG-based SPO tuple recognition using self-attention M-Bi-LSTM
This study proposes a dependency grammar-based self-attention multilayered bidirectional long short-term memory (DG-M-Bi-LSTM) model for subject-predicate-object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to the knowledge base autonomously, it is essential...
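The abstract names a self-attention layer applied over multilayered Bi-LSTM hidden states. As a rough illustration only (the paper's actual formulation is available via the DOI below, and the vectors and dimensions here are invented for the example), scaled dot-product self-attention over a sequence of hidden-state vectors can be sketched in plain Python:

```python
# Toy sketch of scaled dot-product self-attention over per-token hidden
# states (e.g., Bi-LSTM outputs). Illustrative only; not the authors' code.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(H):
    """H: list of token vectors (hidden states), all of the same dimension.
    Returns one attention-weighted context vector per token."""
    d = len(H[0])
    out = []
    for q in H:  # each token attends over all tokens in the sentence
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in H]
        w = softmax(scores)  # attention weights sum to 1
        out.append([sum(wj * kj[i] for wj, kj in zip(w, H)) for i in range(d)])
    return out

# Made-up hidden states for a 3-token sentence, dimension 2
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
C = self_attention(H)  # one context vector per token
```

Each output vector is a convex combination of the input hidden states, which is what lets the model weight dependency-relevant tokens when recognizing SPO tuples.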
| Main Author | Joon-young Jung |
|---|---|
| Format | Article |
| Language | English |
| Published | Electronics and Telecommunications Research Institute (ETRI), 2022-06-01 |
| Series | ETRI Journal |
| Online Access | https://doi.org/10.4218/etrij.2020-0460 |
Similar Items
- On Existence of Prime K-Tuples Conjecture for Positive Proportion of Admissible K-Tuples
  by: Ashish Mor, et al.
  Published: (2024-03-01)
- Partial Discharge Detection and Recognition in Insulated Overhead Conductor Based on Bi-LSTM with Attention Mechanism
  by: Yanhui Xi, et al.
  Published: (2023-05-01)
- k-Tuple total domination and Mycieleskian graphs
  by: Adel P. Kazemi
  Published: (2012-03-01)
- Prediction of Aerosol Extinction Coefficient in Coastal Areas of South China Based on Attention-BiLSTM
  by: Zhou Ye, et al.
  Published: (2022-04-01)
- Enhancer-LSTMAtt: A Bi-LSTM and Attention-Based Deep Learning Method for Enhancer Recognition
  by: Guohua Huang, et al.
  Published: (2022-07-01)