Revealing decurve flows for generalized graph propagation

Bibliographic Details
Main Authors: Lin, C, Ma, L, Chen, Y, Ouyang, W, Bronstein, MM, Torr, PHS
Format: Conference item
Language: English
Published: 2024
Description
Summary: This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation over directed and weighted graphs. The significance manifests in two ways. First, we propose Generalized Propagation Neural Networks (GPNNs), a framework that unifies most propagation-based graph neural networks. By generating directed, weighted propagation graphs with an adjacency function and a connectivity function, GPNNs offer enhanced insight into attention mechanisms across various graph models. We explore the trade-offs within the design space through empirical experiments and, via theoretical analysis, emphasize the crucial role of the adjacency function for model expressivity. Second, we propose the Continuous Unified Ricci Curvature (CURC), an extension of the celebrated Ollivier-Ricci curvature for directed and weighted graphs. Theoretically, we demonstrate that CURC possesses continuity, scale invariance, and a lower-bound connection with the Dirichlet isoperimetric constant, validating bottleneck analysis for GPNNs. We include a preliminary exploration of learned propagation patterns in datasets, a first in the field. We observe an intriguing "decurve flow": a curvature reduction during training for models with learnable propagation, revealing the evolution of propagation over time and a deeper connection to the trade-off between over-smoothing and bottlenecks.
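
As a rough illustration of the generalized propagation described in the summary, the Python sketch below generates a directed, weighted propagation graph from two callables standing in for the paper's adjacency and connectivity functions, then applies one propagation step to node features. All names here (gpnn_layer, adjacency_fn, connectivity_fn) are hypothetical illustrations, not the authors' implementation.

    import numpy as np

    def gpnn_layer(X, adjacency_fn, connectivity_fn, W):
        # X: (n, d) node features; W: (d, d') learnable weights.
        # adjacency_fn(i, j) -> bool decides whether a directed edge
        # i -> j exists in the propagation graph; connectivity_fn(X, i, j)
        # -> float assigns its weight. Both signatures are assumptions
        # made for this sketch.
        n = X.shape[0]
        P = np.zeros((n, n))  # directed, weighted propagation graph
        for i in range(n):
            for j in range(n):
                if adjacency_fn(i, j):
                    P[i, j] = connectivity_fn(X, i, j)
        # Row-normalize so each node aggregates a weighted average
        # over its out-neighbors (rows with no edges stay zero).
        row_sums = P.sum(axis=1, keepdims=True)
        P = np.divide(P, row_sums, out=np.zeros_like(P), where=row_sums > 0)
        return np.maximum(P @ X @ W, 0.0)  # ReLU(P X W) propagation step

Under this reading, attention-based models correspond to one particular choice of connectivity function, which is the unifying perspective the summary claims for GPNNs; "learnable propagation" then means the propagation graph P itself evolves during training.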
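
For context, the classical Ollivier-Ricci curvature that CURC generalizes is, in its standard form on a graph with metric d and a probability measure \mu_x attached to each node x (typically the one-step random-walk distribution),

    \kappa(x, y) = 1 - \frac{W_1(\mu_x, \mu_y)}{d(x, y)},
    \qquad
    W_1(\mu_x, \mu_y) = \inf_{\pi \in \Pi(\mu_x, \mu_y)} \sum_{u, v} d(u, v)\, \pi(u, v),

where W_1 is the Wasserstein-1 distance and \Pi(\mu_x, \mu_y) is the set of couplings of the two measures. How CURC adapts these quantities to directed and weighted graphs while retaining continuity and scale invariance is the paper's contribution and is not reproduced here; the "decurve flow" observation refers to a decrease of this kind of curvature during training.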