Auto-GNN: Neural architecture search of graph neural networks
Graph neural networks (GNNs) have been widely used in various graph analysis tasks. Because graph characteristics vary significantly across real-world systems, the architecture parameters must be tuned carefully to identify a suitable GNN for a given scenario. Neural architecture search (NAS) has shown its potential in discovering effective architectures for learning tasks in image and language modeling. However, existing NAS algorithms cannot be applied efficiently to the GNN search problem for two reasons. First, the large-step exploration in a traditional controller fails to learn the sensitive performance variations caused by slight architecture modifications in GNNs. Second, the search space is composed of heterogeneous GNNs, which prevents the direct adoption of parameter sharing among them to accelerate the search. To tackle these challenges, we propose an automated graph neural network (AGNN) framework that aims to find the optimal GNN architecture efficiently. Specifically, a reinforced conservative controller is designed to explore the architecture space with small steps. To accelerate validation, a novel constrained parameter sharing strategy is presented to regularize weight transfer among GNNs; it avoids training from scratch and saves computation time. Experimental results on benchmark datasets demonstrate that the architecture identified by AGNN achieves the best performance and search efficiency compared with existing human-invented models and traditional search methods.
Main Authors: | Kaixiong Zhou, Xiao Huang, Qingquan Song, Rui Chen, Xia Hu |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2022-11-01 |
Series: | Frontiers in Big Data |
Subjects: | graph neural networks; automated machine learning; neural architecture search; deep and scalable graph analysis; reinforcement learning |
Online Access: | https://www.frontiersin.org/articles/10.3389/fdata.2022.1029307/full |
author | Kaixiong Zhou; Xiao Huang; Qingquan Song; Rui Chen; Xia Hu |
author_sort | Kaixiong Zhou |
collection | DOAJ |
description | Graph neural networks (GNNs) have been widely used in various graph analysis tasks. Because graph characteristics vary significantly across real-world systems, the architecture parameters must be tuned carefully to identify a suitable GNN for a given scenario. Neural architecture search (NAS) has shown its potential in discovering effective architectures for learning tasks in image and language modeling. However, existing NAS algorithms cannot be applied efficiently to the GNN search problem for two reasons. First, the large-step exploration in a traditional controller fails to learn the sensitive performance variations caused by slight architecture modifications in GNNs. Second, the search space is composed of heterogeneous GNNs, which prevents the direct adoption of parameter sharing among them to accelerate the search. To tackle these challenges, we propose an automated graph neural network (AGNN) framework that aims to find the optimal GNN architecture efficiently. Specifically, a reinforced conservative controller is designed to explore the architecture space with small steps. To accelerate validation, a novel constrained parameter sharing strategy is presented to regularize weight transfer among GNNs; it avoids training from scratch and saves computation time. Experimental results on benchmark datasets demonstrate that the architecture identified by AGNN achieves the best performance and search efficiency compared with existing human-invented models and traditional search methods. |
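The small-step ("conservative") exploration described in the abstract can be illustrated with a minimal sketch. The search-space components and mutation rule below are hypothetical placeholders, not the paper's actual AGNN search space: the key idea is that each exploration step changes exactly one architecture component, so a performance change stays attributable to that single modification.

```python
import random

# Hypothetical per-layer search space; the components and options are
# illustrative only, not the paper's actual AGNN space.
SEARCH_SPACE = {
    "aggregator": ["sum", "mean", "max"],
    "activation": ["relu", "tanh", "elu"],
    "hidden_dim": [16, 32, 64],
    "num_heads": [1, 4, 8],
}

def random_architecture(rng):
    """Sample a complete architecture uniformly (large-step exploration)."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def conservative_step(arch, rng):
    """Small-step exploration: mutate exactly one component, keeping the
    rest fixed, so slight performance variations stay attributable."""
    child = dict(arch)
    key = rng.choice(sorted(SEARCH_SPACE))
    alternatives = [v for v in SEARCH_SPACE[key] if v != arch[key]]
    child[key] = rng.choice(alternatives)
    return child

rng = random.Random(0)
parent = random_architecture(rng)
child = conservative_step(parent, rng)
# Exactly one component differs between parent and child.
diff = [k for k in SEARCH_SPACE if parent[k] != child[k]]
assert len(diff) == 1
```

In a full controller, a reinforcement-learning policy (rather than uniform random choice) would decide which component to mutate and to which value, with validation accuracy as the reward.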
format | Article |
id | doaj.art-192b9b096c9245b3b828556eb2a71033 |
institution | Directory Open Access Journal |
issn | 2624-909X |
language | English |
publishDate | 2022-11-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Big Data |
spelling | doaj.art-192b9b096c9245b3b828556eb2a71033 (indexed 2022-12-22T04:39:07Z) |
affiliation | Kaixiong Zhou: DATA Lab, Department of Computer Science, Rice University, Houston, TX, United States; Xiao Huang: Department of Computing, The Hong Kong Polytechnic University, Kowloon, Hong Kong SAR, China; Qingquan Song: LinkedIn, Sunnyvale, CA, United States; Rui Chen: Samsung Research America, Silicon Valley, CA, United States; Xia Hu: DATA Lab, Department of Computer Science, Rice University, Houston, TX, United States |
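The constrained parameter sharing strategy mentioned in the abstract can likewise be sketched. The shape-matching rule below is an assumed, simplified constraint for illustration; the paper's actual regularization of weight transfer among heterogeneous GNNs may differ. The point it shows is how a child architecture inherits a parent's trained weights where they are compatible instead of training from scratch.

```python
import numpy as np

def transfer_weights(child_params, parent_params):
    """Constrained sharing sketch: reuse a parent's trained tensor only
    when the child has a parameter with the same name and shape;
    incompatible parameters keep their fresh initialization."""
    merged = {}
    for name, fresh in child_params.items():
        trained = parent_params.get(name)
        if trained is not None and trained.shape == fresh.shape:
            merged[name] = trained   # compatible: inherit trained weights
        else:
            merged[name] = fresh     # incompatible: train this tensor anew
    return merged

rng = np.random.default_rng(0)
# Parent GNN's trained parameters (hypothetical names and shapes).
parent = {"W_agg": rng.normal(size=(32, 32)), "W_out": rng.normal(size=(32, 7))}
# Child GNN widens the output layer's input, so only W_agg is compatible.
child = {"W_agg": np.zeros((32, 32)), "W_out": np.zeros((64, 7))}
merged = transfer_weights(child, parent)
# W_agg shapes match and is inherited; W_out does not and stays fresh.
```

Sharing only between shape-compatible tensors is what makes the strategy "constrained": unconstrained sharing across heterogeneous GNNs would pair weights with mismatched roles or dimensions.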
title | Auto-GNN: Neural architecture search of graph neural networks |
topic | graph neural networks; automated machine learning; neural architecture search; deep and scalable graph analysis; reinforcement learning |
url | https://www.frontiersin.org/articles/10.3389/fdata.2022.1029307/full |