Generalization error of graph neural networks in the mean-field regime

Bibliographic Details
Main Authors: Aminian, G, He, Y, Reinert, G, Szpruch, L, Cohen, S
Format: Conference item
Language: English
Published: Proceedings of Machine Learning Research, 2024
Description
Summary: This work provides a theoretical framework for assessing the generalization error of graph neural networks in the over-parameterized regime, where the number of parameters exceeds the number of data points. We explore two widely used types of graph neural networks: graph convolutional neural networks and message passing graph neural networks. Prior to this study, existing bounds on the generalization error in the over-parameterized regime were uninformative, limiting our understanding of over-parameterized network performance. Our novel approach involves deriving upper bounds within the mean-field regime for evaluating the generalization error of these graph neural networks. We establish upper bounds with a convergence rate of O(1/n), where n is the number of graph samples. These upper bounds offer a theoretical assurance of the networks' performance on unseen data in the challenging over-parameterized regime and contribute to our overall understanding of their performance.
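As an illustration of the claimed rate, the display below sketches one standard way such a generalization bound is stated. The notation (a learned predictor f̂_n trained on the n graph samples, population risk R, empirical risk R̂_n, and a constant C independent of n) is assumed here for illustration and is not taken from the record; it is not the paper's exact statement.

% Hedged sketch (assumed notation): expected generalization gap of the
% learned predictor \hat{f}_n after training on n graph samples,
% decaying at the O(1/n) rate stated in the summary above.
\[
  \mathbb{E}\bigl[\, R(\hat{f}_n) - \widehat{R}_n(\hat{f}_n) \,\bigr]
  \;\le\; \frac{C}{n} \;=\; O\!\left(\frac{1}{n}\right).
\]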