LGNN: a novel linear graph neural network algorithm


Bibliographic Details
Main Authors: Shujuan Cao, Xiaoming Wang, Zhonglin Ye, Mingyuan Li, Haixing Zhao
Format: Article
Language: English
Published: Frontiers Media S.A. 2023-11-01
Series: Frontiers in Computational Neuroscience
Subjects: graph neural network; linear neural network; graph deep learning; graph representation learning; high-order structural constraint
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2023.1288842/full
collection DOAJ
description The emergence of deep learning has not only transformed the field of image recognition but has also enabled graph neural networks to achieve excellent node classification performance. However, existing graph neural network frameworks typically capture network structure with spatial-domain or spectral-domain methods. These methods capture only the local structural characteristics of graph data, their convolutions are computationally expensive, and multi-channel or deep network architectures are then required to model the network's high-order structural characteristics. This paper therefore proposes a linear graph neural network framework, LGNN, with superior performance. The model first preprocesses the input graph, applying symmetric normalization and feature normalization to remove biases in the structure and features. Then, through a high-order adjacency-matrix propagation mechanism, LGNN lets nodes iteratively aggregate and learn feature information from high-order neighbors. After obtaining node representations of the network structure, LGNN applies a simple linear mapping, which preserves computational efficiency, to produce the final node representations. Experimental results show that LGNN performs slightly worse than existing mainstream graph neural network algorithms on some tasks, but matches or exceeds them on most graph neural network evaluation tasks, especially on sparse networks.
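The pipeline the abstract describes (symmetric normalization, high-order adjacency propagation, then a single linear mapping) resembles simplified linear GNNs such as SGC. A minimal NumPy sketch of that reading follows; the self-loops, row-wise feature normalization, propagation order `k`, and the random (untrained) weight matrix are all assumptions filled in for illustration, not details taken from the paper.

```python
import numpy as np

def lgnn_embed(A, X, k=2, out_dim=16, seed=0):
    """Sketch of the abstract's pipeline: normalize -> propagate k hops -> linear map.
    Everything beyond the abstract's wording is an assumption."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                       # assumed GCN-style self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # symmetric normalization D^-1/2 (A+I) D^-1/2
    X = X / np.maximum(X.sum(axis=1, keepdims=True), 1e-12)  # feature normalization (assumed row-wise)
    H = X
    for _ in range(k):                          # high-order neighbor propagation: S^k X
        H = S @ H
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((X.shape[1], out_dim))  # linear mapping (trained in practice)
    return H @ W

# tiny example: 4-node path graph with one-hot features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Z = lgnn_embed(A, np.eye(4), k=2)
print(Z.shape)  # (4, 16)
```

Because the propagation matrix `S^k` is fixed, the only learnable parameters sit in the final linear map, which is what keeps the computation linear and cheap relative to multi-layer convolutional GNNs.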
first_indexed 2024-03-09T01:57:26Z
id doaj.art-1cdccb1c4a474c4c832bb829f4ff2075
issn 1662-5188
last_indexed 2024-03-09T01:57:26Z
Author affiliations:
Shujuan Cao, Zhonglin Ye, Mingyuan Li, Haixing Zhao: College of Computer, Qinghai Normal University, Xining, Qinghai, China; School of Computer Science, Shaanxi Normal University, Xi’an, Shaanxi, China; The State Key Laboratory of Tibetan Intelligent Information Processing and Application, Xining, Qinghai, China; Key Laboratory of Tibetan Information Processing, Ministry of Education, Xining, Qinghai, China
Xiaoming Wang: School of Computer Science, Shaanxi Normal University, Xi’an, Shaanxi, China
topic graph neural network
linear neural network
graph deep learning
graph representation learning
high-order structural constraint