
GLU: Global-Local-Uncertainty Fusion for Scalable Spatiotemporal Reconstruction and Forecasting

Linzheng Wang, Jason Chen, Nicolas Tricard, Zituo Chen, Sili Deng

Abstract

Digital twins of complex physical systems are expected to infer unobserved states from sparse measurements and predict their evolution in time, yet these two functions are typically treated as separate tasks. Here we present GLU, a Global-Local-Uncertainty framework that formulates sparse reconstruction and dynamic forecasting as a unified state-representation problem and introduces a structured latent assembly shared by both tasks. The central idea is to build a structured latent state that combines a global summary of system-level organization, local tokens anchored to available measurements, and an uncertainty-driven importance field that weights observations according to their physical informativeness. For reconstruction, GLU uses importance-aware adaptive neighborhood selection to retrieve locally relevant information while preserving global consistency and allowing flexible query resolution on arbitrary geometries. Across a suite of challenging benchmarks, GLU consistently improves reconstruction fidelity over reduced-order, convolutional, neural operator, and attention-based baselines, better preserving multi-scale structures. For forecasting, a hierarchical Leader-Follower Dynamics module evolves the latent state with substantially reduced memory growth, maintains stable rollout behavior, and delays error accumulation in nonlinear dynamics. On a realistic turbulent combustion dataset, it further preserves not only sharp fronts and broadband structures in multiple physical fields, but also their cross-channel thermo-chemical couplings. Scalability tests show that these gains are achieved with substantially lower memory growth than comparable attention-based baselines. Together, these results establish GLU as a flexible and computationally practical paradigm for sparse digital twins.
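The structured latent state described above can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the paper's implementation: the names (`local_tokens`, `phi_logits`, `W_local`, `W_global`) are hypothetical, and the learned attention-based global encoder and uncertainty head are replaced here by random linear maps and importance-weighted pooling purely to show how the three streams could be assembled into one latent state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n_sensors sparse measurements, d-dim tokens,
# n_global global summary tokens.
n_sensors, d, n_global = 32, 16, 4

sensor_values = rng.normal(size=(n_sensors, 1))   # sparse measurements
sensor_coords = rng.uniform(size=(n_sensors, 2))  # sensor locations in [0, 1]^2

# Local tokens: each sensor's value and position embedded independently
# (a random linear map stands in for the learned encoder).
W_local = rng.normal(size=(3, d)) / np.sqrt(3)
local_tokens = np.concatenate([sensor_values, sensor_coords], axis=1) @ W_local

# Uncertainty-driven importance field: one positive score per sensor,
# a stand-in for the learned importance head, normalized via softmax.
phi_logits = local_tokens @ rng.normal(size=(d,))
importance = np.exp(phi_logits - phi_logits.max())
importance /= importance.sum()

# Global tokens: importance-weighted pooling of local tokens as a crude
# surrogate for the paper's attention-based global summary.
W_global = rng.normal(size=(d, n_global * d)) / np.sqrt(d)
pooled = importance @ local_tokens                 # (d,)
global_tokens = (pooled @ W_global).reshape(n_global, d)

# Assembled latent state: global summary followed by sensor-anchored tokens.
latent_state = np.concatenate([global_tokens, local_tokens], axis=0)
print(latent_state.shape)  # (n_global + n_sensors, d)
```

The point of the assembly is that downstream modules can read system-level context from the global tokens while the local tokens keep exact fidelity to each measurement, with the importance field mediating how strongly each sensor contributes.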

Paper Structure

This paper contains 14 sections, 20 equations, 6 figures.

Table of Contents

  1. Results
  2. Discussion
  3. Methods

Figures (6)

  • Figure 1: Bridging the reality gap between sparse sensing and dynamic digital twins. a) Complex physical systems monitored by sparse, irregular sensors. b) A continuous stream of sparse, partial observations serves as the input. c) The GLU framework decomposes measurements into three complementary streams. Global tokens extract system-level patterns and long-range correlations, while local tokens preserve fine-grained sensor fidelity. A learned uncertainty signal guides the assimilation of local data. d) The GLU-assembled representations drive the digital twin, enabling both full-state inference of unobserved quantities and stable forecasting of future trajectories.
  • Figure 2: Architecture and performance of the GLU framework. a, Sparse encoding via global-local-aware encoder with bidirectional attention to transform arbitrary sparse inputs into a latent state. b, Soft adaptive reconstruction. For any query location, the model dynamically aggregates the top-$k$ most relevant sensors using an importance-warped distance metric. This process decodes both the mean field value and a spatial uncertainty estimate. c, Leader-Follower-Dynamic structure to model latent temporal evolution. The global leader drives the dynamics by attending to important regions via IMP-Gating, while sensor followers update their local states by attending to the leader. d, Normalized reconstruction error across seven datasets ranging from canonical flows to chaotic reaction-diffusion and multi-physics turbulent combustion. e, Forecasting error accumulation for periodic (Collinear Re40) and chaotic (Collinear Re100, Reaction-Diffusion) dynamics.
  • Figure 3: Adaptive spatial reconstruction mechanisms and information scaling performance. a, Learned importance score distributions ($\phi$) for representative datasets. b, Multi-scale detail recovery: (1) Ground truth and scale-separated reconstructions (large/medium/small) from global-only ablation, Senseiver, and GLU; (2) Spectral and scale-wise summaries. GLU best matches the ground-truth energy spectrum and retains the most energy across scales (bar chart), yielding the lowest error at each scale. c, Mechanism of adaptive selection: (1) Visualization of the adaptive neighborhood on Collinear Flow; (2) Reconstruction error vs. local field complexity; (3) Probability density function of reconstruction errors. d, Reconstruction error versus number of sensors across six datasets.
  • Figure 4: Long-term forecasting stability on complicated dynamics. a, Latent dynamics visualization. 2D PCA projection of latent trajectories for (1) periodic (Collinear Re40, 32 sensors) and (2) chaotic (Collinear Re100, 96 sensors) flows. Observation windows contain 16 steps. (3-4) Corresponding pointwise velocity forecasting confirms that GLU-LFD avoids the phase shifts and amplitude collapses seen in baselines. b, Generalization to unseen dynamics. (1) Error evolution of the reaction-diffusion system from an unseen random initial condition, given 16 steps of sparse observations (64 sensors). (2-3) Quantitative stability metrics. GLU-LFD preserves the correct spatial ($g_r$) and temporal ($g_t$) correlation structures over time, preventing physical degradation.
  • Figure 5: Performance of GLU in learning realistic multi-physics systems (turbulent combustion). a, Instantaneous snapshots and error maps for all channels from 1% sparse sensors. GLU minimizes structural errors at the flame front (sharp interfaces) and in turbulent regions compared to baselines, which suffer from blurring. b, Energy spectra (bottom) and log-spectral distance (LSD) scores (top). GLU adheres to the ground truth spectrum at high wavenumbers, capturing fine-scale turbulence that baselines filter out. c, Joint probability density functions visualizing cross-channel correlations. The low Jensen-Shannon divergence (JSD) scores confirm that GLU accurately preserves the thermodynamic and kinematic couplings inherent in combustion.
  • ...and 1 more figure
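The importance-warped top-$k$ aggregation described in the Figure 2b caption can be sketched as follows. This is a minimal illustrative guess, not the paper's method: the warping form (dividing Euclidean distance by the importance score so that informative sensors appear "closer" to every query), the function name `reconstruct`, and the inverse-distance weighting are all assumptions introduced here for illustration.

```python
import numpy as np

def reconstruct(query, sensor_coords, sensor_values, importance, k=4, eps=1e-8):
    """Aggregate the top-k sensors under an importance-warped distance.

    Illustrative warping assumption: dividing the Euclidean distance by the
    (positive) importance score pulls high-importance sensors into the
    adaptive neighborhood of more query points.
    """
    dist = np.linalg.norm(sensor_coords - query, axis=1)
    warped = dist / (importance + eps)       # importance shrinks effective distance
    idx = np.argsort(warped)[:k]             # adaptive top-k neighborhood
    w = 1.0 / (warped[idx] + eps)            # inverse-warped-distance weights
    w /= w.sum()
    return float(w @ sensor_values[idx]), idx

# Toy configuration: five sensors on the unit square, uniform importance.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
values = np.array([0.0, 1.0, 1.0, 2.0, 1.0])
imp = np.full(5, 0.2)
val, nbrs = reconstruct(np.array([0.5, 0.5]), coords, values, imp, k=3)

# Boosting one sensor's importance pulls it into neighborhoods it would
# otherwise miss on pure Euclidean distance.
imp_boosted = np.array([0.1, 0.1, 0.1, 10.0, 0.1])
val2, nbrs2 = reconstruct(np.array([0.0, 0.0]), coords, values, imp_boosted, k=3)
```

Note that with uniform importance the warped metric reduces to a scaled Euclidean distance, so the selection degenerates to ordinary k-nearest neighbors; the adaptivity in the caption comes entirely from spatial variation in the learned importance field.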