SCGG: A deep structure-conditioned graph generative model.


Bibliographic Details
Main Authors: Faezeh Faez, Negin Hashemi Dijujin, Mahdieh Soleymani Baghshah, Hamid R Rabiee
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2022-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0277887
collection DOAJ
description Deep learning-based graph generation approaches have remarkable capacities for graph data modeling, allowing them to solve a wide range of real-world problems. Enabling these methods to account for different conditions during the generation procedure further increases their effectiveness by empowering them to produce new graph samples that meet desired criteria. This paper presents a conditional deep graph generation method called SCGG that handles a particular type of structural condition. Specifically, the proposed SCGG model takes an initial subgraph and autoregressively generates new nodes and their corresponding edges on top of the given conditioning substructure. The architecture of SCGG consists of a graph representation learning network and an autoregressive generative model, trained end-to-end. More precisely, the graph representation learning network computes continuous representations for each node in a graph that are affected not only by the features of adjacent nodes but also by those of more distant nodes. This network is primarily responsible for supplying the structural condition to the generation procedure, while the autoregressive generative model mainly maintains the generation history. Using this model, we can address graph completion, a prevalent and inherently difficult problem of recovering the missing nodes and associated edges of partially observed graphs. The computational complexity of the SCGG method is shown to be linear in the number of graph nodes. Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method over state-of-the-art baselines.
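The generation loop described in the abstract (grow a partially observed graph one node at a time, linking each new node back to previously placed nodes) can be sketched in plain Python. This is an illustrative toy, not the authors' SCGG implementation: the learned graph representation network and autoregressive model are replaced here by a hypothetical caller-supplied edge-probability function, and `pref_attach` is an assumed stand-in heuristic.

```python
import random

def complete_graph(adj, num_new_nodes, edge_prob_fn, seed=0):
    """Autoregressively grow a partial graph (toy sketch, NOT the
    authors' SCGG model).

    adj          -- observed subgraph as a dict {node: set(neighbors)}
    edge_prob_fn -- edge_prob_fn(graph, new_node, candidate) returns the
                    probability of linking new_node to candidate; stands
                    in for the learned representation + generative nets.
    """
    rng = random.Random(seed)
    # Copy so the caller's observed subgraph is left untouched.
    adj = {u: set(nbrs) for u, nbrs in adj.items()}
    for _ in range(num_new_nodes):
        v = max(adj) + 1  # id of the next generated node
        adj[v] = set()
        # Decide edges from the new node to every earlier node.
        for u in list(adj):
            if u == v:
                continue
            if rng.random() < edge_prob_fn(adj, v, u):
                adj[v].add(u)
                adj[u].add(v)
    return adj

# Hypothetical heuristic: favor high-degree candidates (preferential
# attachment), as a placeholder for learned edge probabilities.
def pref_attach(adj, new_node, candidate):
    total_degree = sum(len(nbrs) for nbrs in adj.values()) or 1
    return min(1.0, 0.5 + len(adj[candidate]) / total_degree)
```

For example, completing a triangle `{0: {1, 2}, 1: {0, 2}, 2: {0, 1}}` with two extra nodes yields a five-node graph in which the original edges are preserved, since the procedure only ever adds nodes and edges on top of the conditioning substructure.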
id doaj.art-f8ed18815b5148ea868fbe4050faea6b
institution Directory Open Access Journal
issn 1932-6203