Asymmetric Graph Contrastive Learning

Learning effective graph representations in an unsupervised manner is a popular research topic in graph data analysis. Recently, contrastive learning has shown its success in unsupervised graph representation learning. However, how to avoid collapsing solutions for contrastive learning methods remains a critical challenge. In this paper, a simple method is proposed to solve this problem for graph representation learning, which is different from existing commonly used techniques (such as negative samples or predictor network). The proposed model mainly relies on an asymmetric design that consists of two graph neural networks (GNNs) with unequal depth layers to learn node representations from two augmented views and defines contrastive loss only based on positive sample pairs. The simple method has lower computational and memory complexity than existing methods. Furthermore, a theoretical analysis proves that the asymmetric design avoids collapsing solutions when training together with a stop-gradient operation. Our method is compared to nine state-of-the-art methods on six real-world datasets to demonstrate its validity and superiority. The ablation experiments further validated the essential role of the asymmetric architecture.

Bibliographic Details
Main Authors: Xinglong Chang, Jianrong Wang, Rui Guo, Yingkui Wang, Weihao Li
Format: Article
Language: English
Published: MDPI AG 2023-10-01
Series: Mathematics
Subjects: contrastive learning; graph neural networks; graph representation learning
Online Access: https://www.mdpi.com/2227-7390/11/21/4505
Collection: DOAJ
Description: Learning effective graph representations in an unsupervised manner is a popular research topic in graph data analysis. Recently, contrastive learning has shown its success in unsupervised graph representation learning. However, how to avoid collapsing solutions for contrastive learning methods remains a critical challenge. In this paper, a simple method is proposed to solve this problem for graph representation learning, which is different from existing commonly used techniques (such as negative samples or predictor network). The proposed model mainly relies on an asymmetric design that consists of two graph neural networks (GNNs) with unequal depth layers to learn node representations from two augmented views and defines contrastive loss only based on positive sample pairs. The simple method has lower computational and memory complexity than existing methods. Furthermore, a theoretical analysis proves that the asymmetric design avoids collapsing solutions when training together with a stop-gradient operation. Our method is compared to nine state-of-the-art methods on six real-world datasets to demonstrate its validity and superiority. The ablation experiments further validated the essential role of the asymmetric architecture.
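The abstract describes the method only at a high level: two GNN branches of unequal depth encode two augmented views, the loss uses positive (same-node) pairs only, and one branch sits behind a stop-gradient. The sketch below illustrates that idea generically; it is not the authors' implementation, and every name, layer size, and augmentation choice here (simple GCN-style propagation, edge dropout, cosine loss) is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gnn(A_norm, X, weights):
    # A stack of simple GCN-style layers; the branch's depth is len(weights).
    H = X
    for W in weights:
        H = np.maximum(A_norm @ H @ W, 0.0)  # propagate, then ReLU
    return H

def drop_edges(A, p, rng):
    # Edge dropout as a simple graph augmentation (kept symmetric).
    keep = np.triu((rng.random(A.shape) > p).astype(float), 1)
    return A * (keep + keep.T)

# Toy graph: 6 nodes, 8-dim features, random symmetric adjacency.
n, d_in, d_hid = 6, 8, 4
A = np.triu((rng.random((n, n)) > 0.5).astype(float), 1)
A = A + A.T
X = rng.normal(size=(n, d_in))

# Asymmetric design: two branches with unequal depths (hypothetical sizes).
W_deep = [0.1 * rng.normal(size=(d_in, d_hid)), 0.1 * rng.normal(size=(d_hid, d_hid))]
W_shallow = [0.1 * rng.normal(size=(d_in, d_hid))]  # one layer fewer

Z1 = gnn(normalize_adj(drop_edges(A, 0.2, rng)), X, W_deep)
Z2 = gnn(normalize_adj(drop_edges(A, 0.2, rng)), X, W_shallow)
# In training, Z2 would sit behind a stop-gradient (e.g. .detach() in PyTorch),
# so only the deeper branch receives gradients from the loss.

def positive_pair_loss(Z1, Z2, eps=1e-8):
    # Negative mean cosine similarity over positive (same-node) pairs only;
    # no negative samples appear anywhere in the objective.
    Z1n = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + eps)
    Z2n = Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + eps)
    return -float(np.mean(np.sum(Z1n * Z2n, axis=1)))

loss = positive_pair_loss(Z1, Z2)
print(Z1.shape, Z2.shape, loss)
```

The point of the sketch is the shape of the objective, not the numbers: both branches emit one embedding per node, the loss touches only matched node pairs, and the depth asymmetry plus stop-gradient is what the paper argues prevents the trivial collapsed solution.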
ISSN: 2227-7390
Volume 11, Issue 21, Article 4505 (2023-10-01); DOI: 10.3390/math11214505
Author affiliations:
Xinglong Chang: School of New Media and Communication, Tianjin University, Tianjin 300350, China
Jianrong Wang: School of New Media and Communication, Tianjin University, Tianjin 300350, China
Rui Guo: College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
Yingkui Wang: Department of Computer Science and Technology, Tianjin Renai College, Tianjin 301636, China
Weihao Li: Data61-CSIRO, Black Mountain Laboratories, Canberra, ACT 2601, Australia
Keywords: contrastive learning; graph neural networks; graph representation learning