Hypergraph convolution and hypergraph attention

Recently, graph neural networks have attracted great attention and achieved prominent performance in various research fields. Most of those algorithms assume pairwise relationships between the objects of interest. In many real applications, however, the relationships between objects are higher-order, going beyond a pairwise formulation. To efficiently learn deep embeddings on high-order graph-structured data, we introduce two end-to-end trainable operators to the family of graph neural networks: hypergraph convolution and hypergraph attention. Whilst hypergraph convolution defines the basic formulation of performing convolution on a hypergraph, hypergraph attention further enhances the capacity of representation learning by leveraging an attention module. With the two operators, a graph neural network is readily extended to a more flexible model and applied to diverse applications where non-pairwise relationships are observed. Extensive experimental results with semi-supervised node classification demonstrate the effectiveness of hypergraph convolution and hypergraph attention.
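
The abstract describes the two operators only at a high level. As a concrete illustration, below is a minimal NumPy sketch of the symmetric-normalized hypergraph convolution it refers to, X' = sigma(D^-1/2 H W B^-1 H^T D^-1/2 X P), where H is the node-hyperedge incidence matrix, W holds the hyperedge weights, and D and B are the vertex and hyperedge degree matrices. The function and variable names here are mine, and this is a sketch under those assumptions, not the authors' released code. Hypergraph attention, as the abstract indicates, would additionally replace the fixed 0/1 entries of H with learned, softmax-normalized node-to-hyperedge attention scores.

```python
import numpy as np

def hypergraph_conv(X, H, w, P):
    """One layer of symmetric-normalized hypergraph convolution (a sketch):
        X' = ReLU(D^-1/2 H W B^-1 H^T D^-1/2 X P)

    X: (N, F)  node feature matrix
    H: (N, M)  incidence matrix; H[i, e] = 1 if node i belongs to hyperedge e
    w: (M,)    hyperedge weights (the diagonal of W)
    P: (F, G)  learnable projection matrix
    """
    d = H @ w                    # vertex degrees: d[i] = sum_e w[e] * H[i, e]
    b = H.sum(axis=0)            # hyperedge degrees: b[e] = number of nodes in edge e
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))   # guard against isolated nodes

    # Propagation matrix A = D^-1/2 H W B^-1 H^T D^-1/2, shape (N, N)
    A = (H * (w / b)) @ H.T
    A = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

    return np.maximum(A @ X @ P, 0.0)   # ReLU non-linearity


# Tiny usage example: 4 nodes, 2 overlapping hyperedges.
rng = np.random.default_rng(0)
H = np.array([[1, 0],
              [1, 1],
              [1, 1],
              [0, 1]], dtype=float)   # hyperedge 0 = {0,1,2}, hyperedge 1 = {1,2,3}
X = rng.standard_normal((4, 8))       # 8-dimensional node features
w = np.ones(2)                        # equal hyperedge weights
P = rng.standard_normal((8, 4))       # project features from 8 to 4 dimensions
print(hypergraph_conv(X, H, w, P).shape)   # -> (4, 4)
```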

Bibliographic Details
Main Authors: Bai, S, Zhang, F, Torr, PHS
Format: Journal article
Language: English
Published: Elsevier 2020
Collection: OXFORD
Institution: University of Oxford
Record ID: oxford-uuid:1904e878-a7c1-4bd3-be1e-701049091d19