GLU: Global-Local-Uncertainty Fusion for Scalable Spatiotemporal Reconstruction and Forecasting

arXiv cs.LG / 3/30/2026


Key Points

  • GLU (Global-Local-Uncertainty Fusion) proposes a unified framework that treats sparse spatiotemporal reconstruction and time-dynamic forecasting as one latent state representation problem for digital twins.
  • The method builds a structured latent state combining a global system summary, measurement-anchored local tokens, and an uncertainty/importance field that weights observations by physical informativeness.
  • For reconstruction, GLU uses importance-aware adaptive neighborhood selection to retrieve locally relevant information while maintaining global consistency and supporting flexible queries over arbitrary geometries.
  • For forecasting, it introduces a hierarchical Leader–Follower Dynamics module that evolves the latent state with reduced memory growth and more stable rollouts, delaying error accumulation in nonlinear dynamics.
  • Experiments on multiple benchmarks (including a turbulent combustion dataset) show improved reconstruction and forecast fidelity over reduced-order, convolutional, neural operator, and attention baselines, with these gains achieved at substantially lower memory growth than comparable attention-based methods.
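The structured latent state described above can be illustrated with a minimal numpy sketch. The shapes, the exponential-of-negative-log-variance importance weighting, and the concatenation scheme here are illustrative assumptions, not the paper's actual operators:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N sparse sensors, each embedded as a d-dim token.
N, d = 32, 8
local_tokens = rng.normal(size=(N, d))  # measurement-anchored local tokens
log_var = rng.normal(size=N)            # assumed per-sensor predicted log-variance

# Uncertainty/importance field: lower predicted variance -> higher weight.
importance = np.exp(-log_var)
importance /= importance.sum()

# Global summary: importance-weighted pooling of local tokens (one simple
# stand-in for the "global system summary"; the paper's operator is unspecified here).
global_summary = importance @ local_tokens  # shape (d,)

# Structured latent state: each local token paired with the shared global summary.
latent_state = np.concatenate(
    [np.broadcast_to(global_summary, (N, d)), local_tokens], axis=1
)  # shape (N, 2*d)
```

In this toy form, a downstream decoder could query `latent_state` at arbitrary locations, with `importance` deciding how strongly each observation contributes to the global context.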

Abstract

Digital twins of complex physical systems are expected to infer unobserved states from sparse measurements and predict their evolution in time, yet these two functions are typically treated as separate tasks. Here we present GLU, a Global-Local-Uncertainty framework that formulates sparse reconstruction and dynamic forecasting as a unified state-representation problem and introduces a structured latent assembly shared by both tasks. The central idea is to build a structured latent state that combines a global summary of system-level organization, local tokens anchored to available measurements, and an uncertainty-driven importance field that weights observations according to their physical informativeness. For reconstruction, GLU uses importance-aware adaptive neighborhood selection to retrieve locally relevant information while preserving global consistency and allowing flexible query resolution on arbitrary geometries. Across a suite of challenging benchmarks, GLU consistently improves reconstruction fidelity over reduced-order, convolutional, neural operator, and attention-based baselines, better preserving multi-scale structures. For forecasting, a hierarchical Leader-Follower Dynamics module evolves the latent state with substantially reduced memory growth, maintains stable rollout behavior, and delays error accumulation in nonlinear dynamics. On a realistic turbulent combustion dataset, it further preserves not only sharp fronts and broadband structures in multiple physical fields, but also their cross-channel thermo-chemical couplings. Scalability tests show that these gains are achieved with substantially lower memory growth than comparable attention-based baselines. Together, these results establish GLU as a flexible and computationally practical paradigm for sparse digital twins.
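The Leader-Follower rollout idea, evolving a small "leader" state first and letting a larger "follower" state track it, can be sketched with hypothetical linear dynamics. The matrices, dimensions, and step count below are illustrative assumptions; the paper's module is learned and nonlinear:

```python
import numpy as np

rng = np.random.default_rng(1)

d_lead, d_follow, steps = 4, 16, 50

# Assumed linear dynamics: a compact leader evolves autonomously, while the
# higher-dimensional follower is driven by the leader and its own damped dynamics.
A = 0.95 * np.eye(d_lead)                           # stable leader transition
B = rng.normal(scale=0.1, size=(d_follow, d_lead))  # leader -> follower coupling
C = 0.90 * np.eye(d_follow)                         # contractive follower dynamics

leader = rng.normal(size=d_lead)
follower = rng.normal(size=d_follow)

for _ in range(steps):
    leader = A @ leader                    # coarse dynamics rolled out first
    follower = C @ follower + B @ leader   # fine state follows the leader

# With spectral radii below 1, the rollout stays bounded over long horizons,
# mirroring (in a toy way) the stability claim for the hierarchical module.
```

Only the low-dimensional leader carries long-range temporal structure here, which is one way a hierarchy like this can keep memory growth modest compared to attending over the full high-dimensional history.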