Multi-Task Network Representation Learning
Networks, such as social networks, biochemical networks, and protein-protein interaction networks, are ubiquitous in the real world. Network representation learning aims to embed the nodes of a network as low-dimensional, dense, real-valued vectors and thereby facilitate downstream network analysis.
Main Authors: | Yu Xie, Peixuan Jin, Maoguo Gong, Chen Zhang, Bin Yu |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2020-01-01 |
Series: | Frontiers in Neuroscience |
Subjects: | multi-task learning; representation learning; node classification; link prediction; graph neural network |
Online Access: | https://www.frontiersin.org/article/10.3389/fnins.2020.00001/full |
_version_ | 1819080492150947840 |
---|---|
author | Yu Xie; Peixuan Jin; Maoguo Gong; Chen Zhang; Bin Yu |
author_facet | Yu Xie; Peixuan Jin; Maoguo Gong; Chen Zhang; Bin Yu |
author_sort | Yu Xie |
collection | DOAJ |
description | Networks, such as social networks, biochemical networks, and protein-protein interaction networks, are ubiquitous in the real world. Network representation learning aims to embed the nodes of a network as low-dimensional, dense, real-valued vectors and thereby facilitate downstream network analysis. Existing embedding methods commonly endeavor to capture the structural information of a network but give little consideration to the subsequent tasks and the synergies between these tasks, which are equally important for learning desirable network representations. To address this issue, we propose a novel multi-task network representation learning (MTNRL) framework that is end-to-end and more effective for the underlying tasks. The original network and the incomplete network share a unified embedding layer, followed by node classification and link prediction tasks that are performed simultaneously on the embedding vectors. By optimizing the multi-task loss function, our framework jointly learns a task-oriented embedding representation for each node. Moreover, our framework is applicable to all network embedding methods, and the experimental results on several benchmark datasets demonstrate the effectiveness of the proposed framework compared with state-of-the-art methods. |
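To make the multi-task setup described in the abstract concrete, here is a minimal sketch, assuming a PyTorch-style implementation: a unified embedding layer shared by a node-classification head and an inner-product link-prediction head, trained with a weighted joint loss. The class and function names, the inner-product decoder, and the balancing coefficient `alpha` are illustrative assumptions, not the authors' actual MTNRL code.

```python
# Hypothetical sketch of a shared-embedding multi-task objective
# (node classification + link prediction); not the authors' MTNRL code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskEmbedding(nn.Module):
    def __init__(self, num_nodes, dim, num_classes):
        super().__init__()
        # Unified embedding layer shared by both tasks.
        self.embed = nn.Embedding(num_nodes, dim)
        # Node-classification head on top of the shared embeddings.
        self.classifier = nn.Linear(dim, num_classes)

    def classify(self, nodes):
        # Class logits for a batch of node indices.
        return self.classifier(self.embed(nodes))

    def link_score(self, src, dst):
        # Inner-product decoder: one common choice for scoring candidate edges.
        return (self.embed(src) * self.embed(dst)).sum(dim=-1)

def multitask_loss(model, nodes, labels, pos_edges, neg_edges, alpha=0.5):
    # Node-classification loss on labelled nodes.
    cls_loss = F.cross_entropy(model.classify(nodes), labels)
    # Link-prediction loss: observed edges vs. sampled negative edges.
    pos = model.link_score(pos_edges[0], pos_edges[1])
    neg = model.link_score(neg_edges[0], neg_edges[1])
    scores = torch.cat([pos, neg])
    targets = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
    link_loss = F.binary_cross_entropy_with_logits(scores, targets)
    # Joint objective: alpha balances the two tasks.
    return alpha * cls_loss + (1.0 - alpha) * link_loss

# Toy usage: 100 nodes, 16-dimensional embeddings, 3 classes.
model = MultiTaskEmbedding(num_nodes=100, dim=16, num_classes=3)
nodes = torch.arange(10)
labels = torch.randint(0, 3, (10,))
pos_edges = torch.randint(0, 100, (2, 20))
neg_edges = torch.randint(0, 100, (2, 20))
multitask_loss(model, nodes, labels, pos_edges, neg_edges).backward()
```

In this reading, the link-prediction branch corresponds to reconstructing the edges removed from the "incomplete network" mentioned in the abstract, while `alpha` controls the trade-off between the two task-specific losses.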
first_indexed | 2024-12-21T19:45:44Z |
format | Article |
id | doaj.art-e594180150264de18cb92dc11d5e45cf |
institution | Directory Open Access Journal |
issn | 1662-453X |
language | English |
last_indexed | 2024-12-21T19:45:44Z |
publishDate | 2020-01-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Neuroscience |
spelling | doaj.art-e594180150264de18cb92dc11d5e45cf | 2022-12-21T18:52:21Z | eng | Frontiers Media S.A. | Frontiers in Neuroscience | 1662-453X | 2020-01-01 | 14 | 10.3389/fnins.2020.00001 | 494995 | Multi-Task Network Representation Learning | Yu Xie (School of Computer Science and Technology, Xidian University, Xi'an, China); Peixuan Jin (School of Computer Science and Technology, Xidian University, Xi'an, China); Maoguo Gong (Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Electronic Engineering, Xidian University, Xi'an, China); Chen Zhang (School of Computer Science and Technology, Xidian University, Xi'an, China); Bin Yu (School of Computer Science and Technology, Xidian University, Xi'an, China) | Networks, such as social networks, biochemical networks, and protein-protein interaction networks, are ubiquitous in the real world. Network representation learning aims to embed the nodes of a network as low-dimensional, dense, real-valued vectors and thereby facilitate downstream network analysis. Existing embedding methods commonly endeavor to capture the structural information of a network but give little consideration to the subsequent tasks and the synergies between these tasks, which are equally important for learning desirable network representations. To address this issue, we propose a novel multi-task network representation learning (MTNRL) framework that is end-to-end and more effective for the underlying tasks. The original network and the incomplete network share a unified embedding layer, followed by node classification and link prediction tasks that are performed simultaneously on the embedding vectors. By optimizing the multi-task loss function, our framework jointly learns a task-oriented embedding representation for each node. Moreover, our framework is applicable to all network embedding methods, and the experimental results on several benchmark datasets demonstrate the effectiveness of the proposed framework compared with state-of-the-art methods. | https://www.frontiersin.org/article/10.3389/fnins.2020.00001/full | multi-task learning; representation learning; node classification; link prediction; graph neural network |
spellingShingle | Yu Xie; Peixuan Jin; Maoguo Gong; Chen Zhang; Bin Yu; Multi-Task Network Representation Learning; Frontiers in Neuroscience; multi-task learning; representation learning; node classification; link prediction; graph neural network |
title | Multi-Task Network Representation Learning |
title_full | Multi-Task Network Representation Learning |
title_fullStr | Multi-Task Network Representation Learning |
title_full_unstemmed | Multi-Task Network Representation Learning |
title_short | Multi-Task Network Representation Learning |
title_sort | multi task network representation learning |
topic | multi-task learning; representation learning; node classification; link prediction; graph neural network |
url | https://www.frontiersin.org/article/10.3389/fnins.2020.00001/full |
work_keys_str_mv | AT yuxie multitasknetworkrepresentationlearning AT peixuanjin multitasknetworkrepresentationlearning AT maoguogong multitasknetworkrepresentationlearning AT chenzhang multitasknetworkrepresentationlearning AT binyu multitasknetworkrepresentationlearning |