RAGAT: Relation Aware Graph Attention Network for Knowledge Graph Completion

Knowledge graph completion (KGC) is the task of predicting missing links for knowledge graphs based on known triples. Several recent works suggest that Graph Neural Networks (GNNs) that exploit graph structure achieve promising performance on KGC. These models learn information, called messages, from neighboring entities and relations and then aggregate the messages to update central entity representations. The drawback of existing GNN-based models is that they tend to treat all relations equally and learn fixed network parameters, overlooking the distinct information carried by each relation. In this work, we propose a Relation Aware Graph ATtention network (RAGAT) that constructs separate message functions for different relations, aiming to exploit the heterogeneous characteristics of knowledge graphs. Specifically, we introduce relation-specific parameters to augment the expressive capability of the message functions, which enables the model to extract relational information in parameter space. To validate the effect of the relation-aware mechanism, RAGAT is implemented with a variety of relation-aware message functions. Experiments show that RAGAT outperforms state-of-the-art link prediction baselines on the standard FB15k-237 and WN18RR datasets.
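The abstract describes a message-passing scheme in which the message for each edge (subject, relation, object) is computed with parameters indexed by the relation type before being aggregated into the central entity representation. Below is a minimal, illustrative sketch in Python/PyTorch of such a relation-aware layer; it is not the authors' implementation. The layer name, the (entity + relation) message form, and the mean aggregation used in place of attention are assumptions made for brevity.

# Illustrative sketch only (not the RAGAT code): a GNN layer whose message
# function uses a separate weight matrix per relation type.
import torch
import torch.nn as nn


class RelationAwareLayer(nn.Module):
    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        # One weight matrix per relation: the "relation-specific parameters"
        # that let messages differ across relation types.
        self.rel_weights = nn.Parameter(torch.empty(num_relations, dim, dim))
        nn.init.xavier_uniform_(self.rel_weights)
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, entity_emb, rel_emb, edge_index, edge_type):
        # entity_emb: (num_entities, dim), rel_emb: (num_relations, dim)
        # edge_index: (2, num_edges) rows = [source, target]
        # edge_type:  (num_edges,) relation id of each edge
        src, dst = edge_index
        # Message for edge (s, r, o): (h_s + h_r) transformed by W_r
        # (an assumed message form, chosen for illustration).
        raw = entity_emb[src] + rel_emb[edge_type]
        msg = torch.einsum('ed,edk->ek', raw, self.rel_weights[edge_type])
        # Mean over incoming edges; the paper uses attention, omitted here.
        out = torch.zeros_like(entity_emb)
        out.index_add_(0, dst, msg)
        deg = torch.zeros(entity_emb.size(0), device=entity_emb.device)
        deg.index_add_(0, dst, torch.ones_like(dst, dtype=entity_emb.dtype))
        out = out / deg.clamp(min=1).unsqueeze(-1)
        return torch.relu(out + self.self_loop(entity_emb))


# Toy usage with random embeddings and a three-edge graph.
layer = RelationAwareLayer(num_relations=3, dim=8)
ents = torch.randn(5, 8)
rels = torch.randn(3, 8)
edges = torch.tensor([[0, 1, 2], [1, 2, 0]])   # sources, targets
etypes = torch.tensor([0, 2, 1])               # relation id per edge
updated = layer(ents, rels, edges, etypes)     # (5, 8) updated entity embeddings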

Bibliographic Details
Main Authors: Xiyang Liu, Huobin Tan, Qinghong Chen, Guangyan Lin (School of Software, Beihang University, Beijing, China)
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access (Vol. 9, pp. 20840-20849)
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3055529
Subjects: Knowledge graph completion; knowledge graph embedding; graph attention networks
Online Access: https://ieeexplore.ieee.org/document/9340326/