
Generalization Bounds for Spectral GNNs via Fourier Domain Analysis

Vahan A. Martirosyan, Daniele Malitesta, Hugues Talbot, Jhony H. Giraldo, Fragkiskos D. Malliaros

Abstract

Spectral graph neural networks learn graph filters, but their behavior with increasing depth and polynomial order is not well understood. We analyze these models in the graph Fourier domain, where each layer becomes an element-wise frequency update, separating the fixed spectrum from the trainable parameters and making depth and order explicit. In this setting, we show that Gaussian complexity is invariant under the Graph Fourier Transform, which allows us to derive data-dependent, depth- and order-aware generalization bounds together with stability estimates. In the linear case, our bounds are tighter, and on real graphs, the data-dependent term correlates with the generalization gap across polynomial bases, highlighting practical choices that avoid frequency amplification across layers.
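The abstract's central observation is that a spectral filter acts as an element-wise update in the graph Fourier domain. A minimal NumPy sketch (illustrative only, not the paper's code; the graph, filter order, and coefficients are arbitrary) shows the two equivalent views of a linear polynomial filter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric normalized Laplacian of a small random undirected graph.
n = 6
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T
d = A.sum(axis=1)
d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(np.maximum(d, 1)), 0.0)
L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

# Graph Fourier Transform: eigenvectors U of L; eigenvalues lam are the
# graph "frequencies". U is orthonormal, so the GFT preserves norms.
lam, U = np.linalg.eigh(L)

# Polynomial filter g(L) = sum_k theta_k L^k of order K (monomial basis).
K = 3
theta = rng.standard_normal(K + 1)
x = rng.standard_normal(n)

# Vertex domain: y = g(L) x.
gL = sum(t * np.linalg.matrix_power(L, k) for k, t in enumerate(theta))
y_vertex = gL @ x

# Fourier domain: each frequency component is scaled independently by
# g(lam_i), i.e., the layer is an element-wise frequency update.
g_lam = sum(t * lam**k for k, t in enumerate(theta))
y_fourier = U @ (g_lam * (U.T @ x))

assert np.allclose(y_vertex, y_fourier)
```

Because the spectrum `lam` is fixed by the graph while only `theta` is trained, this factorization is what lets depth and order appear explicitly in the analysis.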


Paper Structure

This paper contains 44 sections, 14 theorems, 71 equations, 8 figures, and 2 tables.

Key Result

Theorem 1

With probability at least $1-\delta$, for any predictor $f \in \mathcal{F}$, the generalization gap is bounded in terms of $u = n - m$ and absolute constants $C_1, C_2$. (The full bound is stated in the paper; it is omitted in this summary.)

Figures (8)

  • Figure 1: Basis stability across the spectrum ($K=10$).
  • Figure 2: Jacobian bound tightness.
  • Figure 3: Cora linear models.
  • Figure 4: Cora nonlinear model.
  • Figure 5: Chameleon linear models.
  • ...and 3 more figures

Theorems & Definitions (24)

  • Definition 1: Transductive Rademacher Complexity (El-Yaniv & Pechyony, 2009)
  • Theorem 1: Transductive Generalization Bound (El-Yaniv & Pechyony, 2009)
  • Definition 2: FTGC
  • Lemma 1: TRC Bound by FTGC
  • Theorem 2: Generalization Gap by FTGC
  • Definition 3: Generalized Vandermonde Matrix
  • Lemma 2: FTGC Invariance
  • Lemma 3: Lipschitz Preservation of Transformed Activation
  • Definition 4: Input Signal Spectral Energy
  • Theorem 3: Data-Dependent FTGC Bound
  • ...and 14 more