Project and Generate: Divergence-Free Neural Operators for Incompressible Flows

Xigui Li, Hongwei Zhang, Ruoxi Jiang, Deshu Chen, Chensen Lin, Limei Han, Yuan Qi, Xin Guo, Yuan Cheng

Abstract

Learning-based models for fluid dynamics often operate in unconstrained function spaces, leading to physically inadmissible, unstable simulations. While penalty-based methods offer soft regularization, they provide no structural guarantees, resulting in spurious divergence and long-term collapse. In this work, we introduce a unified framework that enforces the incompressible continuity equation as a hard, intrinsic constraint for both deterministic and generative modeling. First, to project deterministic models onto the divergence-free subspace, we integrate a differentiable spectral Leray projection grounded in the Helmholtz-Hodge decomposition, which restricts the regression hypothesis space to physically admissible velocity fields. Second, to generate physically consistent distributions, we show that simply projecting model outputs is insufficient when the prior is incompatible. To address this, we construct a divergence-free Gaussian reference measure via a curl-based pushforward, ensuring the entire probability flow remains subspace-consistent by construction. Experiments on 2D Navier-Stokes equations demonstrate exact incompressibility up to discretization error and substantially improved stability and physical consistency.

Paper Structure

This paper contains 49 sections, 4 theorems, 38 equations, 10 figures, 3 tables, 3 algorithms.

Key Result

Lemma 4.1

The operator $\nabla^\perp$ is a bounded linear map whose image is exactly the closed subspace of divergence-free, zero-mean vector fields in $L^2$.
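Lemma 4.1 underlies the curl-based pushforward used to build the divergence-free Gaussian reference measure: applying $\nabla^\perp \psi = (\partial_y \psi, -\partial_x \psi)$ to any scalar field $\psi$ yields a divergence-free, zero-mean vector field, since $\partial_x \partial_y \psi - \partial_y \partial_x \psi = 0$. A hedged sketch of this pushforward on a periodic grid, assuming numpy (names are illustrative):

```python
import numpy as np

def curl_pushforward(psi):
    """Map a scalar field psi to its rotated gradient
    grad_perp(psi) = (d psi/dy, -d psi/dx), computed spectrally.
    The output is divergence-free and zero-mean by construction."""
    n, m = psi.shape
    kx = np.fft.fftfreq(n).reshape(-1, 1)
    ky = np.fft.fftfreq(m).reshape(1, -1)
    psi_hat = np.fft.fft2(psi)
    u = np.fft.ifft2(2j * np.pi * ky * psi_hat).real    # d psi / dy
    v = np.fft.ifft2(-2j * np.pi * kx * psi_hat).real   # -d psi / dx
    return u, v
```

Feeding Gaussian white noise through this map gives a sample from a divergence-free Gaussian measure, so a probability flow initialized there never leaves the admissible subspace.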

Figures (10)

  • Figure 1: Motivation for the Project & Generate framework illustrated using numerical solvers. (a) Visual comparison. Velocity streamlines (top, colored by error magnitude), divergence fields (middle), and recovered physical pressure (bottom). The Baseline approach (middle), which enforces only the momentum equation, exhibits severe violations of mass conservation with nonzero divergence and noisy pressure fields. A Leray-projected solution (right) enforces incompressibility via a spectral Leray projection, yielding a divergence-free velocity field ($\nabla \cdot \boldsymbol{u} = 0$) to machine precision and recovering a smooth, physically consistent pressure field. The result is visually indistinguishable from the Ground Truth numerical solution (left). (b) Computational paradigms. Left: Classical numerical solvers (CFD) enforce incompressibility through iterative Poisson solves, which require accessing the pressure variable and are computationally expensive. Middle: Unconstrained end-to-end learning approaches typically violate the continuity constraint. Right: We leverage Leray projection to decouple dynamics from constraint enforcement, enabling hard physical constraints to be imposed via a fast, differentiable spectral operator.
  • Figure 2: Schematic of the proposed Projective Framework. The grey surface represents the divergence-free subspace $\mathcal{V}$ (the physical manifold). (a) Projective Regression: The framework takes an input and uses a base neural operator $f_\theta$ to produce an intermediate prediction in the ambient space (shown hovering above the manifold). This unconstrained prediction is then strictly mapped onto $\mathcal{V}$ via the spectral Leray projection $\mathcal{P}$. The composite system constitutes a Divergence-Free Neural Operator, ensuring the final output is physically consistent. (b) Projective Generative Model: The generation process is initialized with a sample from a divergence-free noise distribution (bottom left) constrained to $\mathcal{V}$. During the probability flow evolution ($\tau \in [0, 1]$), the vector field predicted by $f_\theta$ is continuously projected via $\mathcal{P}$ (red arrows) back onto the manifold. This enforces hard constraints at every step of the flow matching process, ensuring the entire trajectory stays within the physically admissible subspace.
  • Figure 3: Flow Field Visualization. Top Row: Snapshot of velocity components ($u, v$). Bottom Row: Streamlines and Pressure field. The pressure is reconstructed from the predicted velocity field via the pressure Poisson equation, serving as a proxy for the dynamical consistency of the flow structure.
  • Figure 4: Enstrophy Spectrum Analysis. Comparison of energy spectra. Our method accurately recovers the inertial range scalings, preserving the correct energy cascade across scales.
  • Figure 5: Physical Field Reconstruction (Prediction). A unified view of the kinematic and topological state. Top: Velocity fields. Middle: Conservation metrics showing baseline failure. Bottom: Topology.
  • ...and 5 more figures

Theorems & Definitions (4)

  • Lemma 4.1
  • Theorem 4.2: Existence of a Divergence-Free Flow Matching Vector Field
  • Lemma 3.1: Properties of the Curl Push-forward
  • Theorem 4.1: Existence of a Divergence-Free Flow Matching Vector Field