Hypergraph transformer for semi-supervised classification

Detailed Description

Hypergraphs play a pivotal role in the modelling of data featuring higher-order relations involving more than two entities. Hypergraph neural networks emerge as a powerful tool for processing hypergraph-structured data, delivering remarkable performance across various tasks, e.g., hypergraph node classification. However, these models struggle to capture global structural information due to their reliance on local message passing. To address this challenge, we propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT). HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges. To incorporate local structural information, HyperGT has two distinct designs: i) a positional encoding based on the hypergraph incidence matrix, offering valuable insights into node-node and hyperedge-hyperedge interactions; and ii) a hypergraph structure regularization in the loss function, capturing connectivities between nodes and hyperedges. Through these designs, HyperGT achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns. Extensive experiments conducted on real-world hypergraph node classification tasks showcase that HyperGT consistently outperforms existing methods, establishing new state-of-the-art benchmarks. Ablation studies affirm the effectiveness of the individual designs of our model.
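The abstract specifies HyperGT only at the design level. The PyTorch sketch below shows one plausible way the two stated designs could fit together, and everything in it is an illustrative assumption rather than the paper's implementation: nodes and hyperedges enter a standard Transformer encoder as one joint token sequence, the positional encodings are linear projections of the incidence matrix's rows (node side) and columns (hyperedge side), and the structure regularizer matches node-hyperedge similarity scores to the incidence matrix. The names HyperGTSketch and structure_regularizer, the mean-pooled hyperedge input features, and all layer sizes are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperGTSketch(nn.Module):
    """Sketch of the HyperGT design: nodes and hyperedges are tokens in a
    single Transformer, so self-attention models global correlations among
    all of them; local structure enters via incidence-based positional
    encodings here and a structure regularizer on the loss (below)."""

    def __init__(self, node_dim, hidden_dim, n_classes,
                 n_nodes, n_edges, n_layers=2, n_heads=4):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, hidden_dim)
        self.edge_proj = nn.Linear(node_dim, hidden_dim)
        # Positional encodings from the incidence matrix H (n_nodes x n_edges):
        # each node is encoded from its row of H, each hyperedge from its column.
        self.node_pe = nn.Linear(n_edges, hidden_dim)
        self.edge_pe = nn.Linear(n_nodes, hidden_dim)
        layer = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, X, H):
        # X: (n_nodes, node_dim) node features; H: (n_nodes, n_edges), binary.
        n_nodes = X.shape[0]
        # Hyperedge input features: mean of member-node features (assumption).
        deg = H.sum(dim=0).clamp(min=1.0).unsqueeze(1)       # (n_edges, 1)
        E = (H.t() @ X) / deg                                # (n_edges, node_dim)
        node_tok = self.node_proj(X) + self.node_pe(H)       # rows of H
        edge_tok = self.edge_proj(E) + self.edge_pe(H.t())   # columns of H
        tokens = torch.cat([node_tok, edge_tok], dim=0).unsqueeze(0)
        Z = self.encoder(tokens).squeeze(0)
        Z_nodes, Z_edges = Z[:n_nodes], Z[n_nodes:]
        return self.classifier(Z_nodes), Z_nodes, Z_edges


def structure_regularizer(Z_nodes, Z_edges, H):
    """One plausible structure regularizer: score every node-hyperedge pair
    by embedding similarity and match the scores to the incidence matrix,
    so node-hyperedge connectivity survives global attention."""
    scores = Z_nodes @ Z_edges.t()                           # (n_nodes, n_edges)
    return F.binary_cross_entropy_with_logits(scores, H)


# Toy semi-supervised run: 6 nodes, 3 hyperedges, labels on 2 nodes only.
X = torch.randn(6, 4)
H = torch.bernoulli(torch.full((6, 3), 0.5))
model = HyperGTSketch(node_dim=4, hidden_dim=16, n_classes=2,
                      n_nodes=6, n_edges=3)
logits, Z_nodes, Z_edges = model(X, H)
labeled = torch.tensor([0, 3])
loss = (F.cross_entropy(logits[labeled], torch.tensor([0, 1]))
        + 0.1 * structure_regularizer(Z_nodes, Z_edges, H))
```

The 0.1 regularization weight is arbitrary; the abstract does not give the paper's loss weighting, optimizer, or label-masking scheme.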

Bibliographic Details

Main Authors: Liu, Z; Tang, B; Ye, Z; Dong, X; Chen, S; Wang, Y
Format: Conference item
Language: English
Published: IEEE, 2024
Institution: University of Oxford