Scale-Aware Adversarial Analysis: A Diagnostic for Generative AI in Multiscale Complex Systems

arXiv cs.LG / 5/4/2026


Key Points

  • The paper argues that current explainable-AI (XAI) methods for generative models, especially perturbation-based and gradient-saliency approaches, rely on pixel-wise changes that create unphysical artifacts and push inputs outside the valid data distribution.
  • It introduces a scale-aware diagnostic framework built on Constrained Diffusion Decomposition (CDD), which decomposes data into physically constrained scales and enables model evaluation through diffusion-driven, scale-aware modifications.
  • Using this framework with a denoising diffusion probabilistic model (DDPM), the authors apply deterministic interventions in a continuous, CDD-derived scale space rather than relying on unphysical input perturbations.
  • Results show that under moderate physical perturbations, an unconstrained generative model can exhibit localized structural freezing and nonlinear instability, failing to preserve cross-scale continuity and diverging when exposed to unseen physical states.
  • The authors claim the method provides a controlled physical testbed for probing algorithmic vulnerabilities and guiding future generative architectures to better respect multiscale causality.
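To make the critique in the first point concrete, here is a minimal toy sketch of occlusion-style perturbation saliency, the kind of pixel-wise method the paper argues against. The `model_score` function is a hypothetical stand-in for any black-box scoring function; the hard-zeroed patches are exactly the off-distribution edits at issue:

```python
import numpy as np

def model_score(x):
    """Toy stand-in for a black-box model: mean gradient energy of a 2D field."""
    gy, gx = np.gradient(x)
    return float(np.mean(gy**2 + gx**2))

def occlusion_saliency(x, patch=4):
    """Pixel-wise perturbation saliency: zero out each patch and record the
    score change. The zeroed patch introduces a hard discontinuity at its
    edges, an unphysical input no real multiscale field would contain."""
    base = model_score(x)
    sal = np.zeros_like(x)
    for i in range(0, x.shape[0], patch):
        for j in range(0, x.shape[1], patch):
            xp = x.copy()
            xp[i:i+patch, j:j+patch] = 0.0  # off-distribution occlusion
            sal[i:i+patch, j:j+patch] = abs(model_score(xp) - base)
    return sal

rng = np.random.default_rng(0)
field = rng.standard_normal((16, 16))
sal = occlusion_saliency(field)
```

The saliency map it produces attributes score changes to patches, but every probe input carries the occlusion artifact, which is the distributional failure the CDD framework is designed to avoid.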

Abstract

Complex physical systems, from supersonic turbulence to the macroscopic structure of the universe, are governed by continuous multiscale dynamics. While modern machine learning architectures excel at mapping the high-dimensional observables of these systems, it remains unclear whether they internalize the governing physical laws or merely interpolate discrete statistical correlations. Standard Explainable AI (XAI) architectures, particularly perturbation-based and gradient-saliency methods, rely on pixel-wise perturbations, which generate unphysical artifacts and push inputs off the valid empirical distribution. To resolve this, we introduce a diagnostic framework driven by Constrained Diffusion Decomposition (CDD), a diffusion-based multiscale data decomposition algorithm that enables physically constrained data generation and model evaluation via scale-aware modifications. Applying this framework to a Denoising Diffusion Probabilistic Model (DDPM), we execute deterministic interventions directly within the continuous, CDD-based scale space. We demonstrate that under moderate physical perturbations, the unconstrained generative model exhibits localized structural freezing and non-linear instability rather than continuous PDE-like responses. The network fails to maintain cross-scale continuity, causing the generative trajectory to diverge when pushed into unseen physical states. By synthesizing a continuum of physically coherent states, this scale-informed methodology establishes a controlled test ground to evaluate algorithmic vulnerabilities, providing the rigorous physical constraints necessary for future architectures to respect the multiscale causality of the natural universe.
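The scale-space intervention idea can be sketched with a simple Gaussian band-pass decomposition. This is only an illustrative stand-in for the paper's CDD algorithm (which is diffusion-based and physically constrained); the function names and band choices here are assumptions for the sketch. The key property it shares with the described method is that an intervention rescales one scale band smoothly and then recomposes, rather than editing pixels directly:

```python
import numpy as np

def gaussian_lowpass(x, sigma):
    """Isotropic Gaussian smoothing via FFT (periodic boundaries)."""
    n0, n1 = x.shape
    f0 = np.fft.fftfreq(n0)[:, None]
    f1 = np.fft.fftfreq(n1)[None, :]
    kernel = np.exp(-2.0 * (np.pi * sigma) ** 2 * (f0**2 + f1**2))
    return np.real(np.fft.ifft2(np.fft.fft2(x) * kernel))

def band_decompose(x, sigmas=(1.0, 2.0, 4.0)):
    """Split a field into detail bands plus a coarse residual: a crude
    Gaussian band-pass stand-in for the paper's CDD decomposition."""
    bands, current = [], x
    for s in sigmas:
        low = gaussian_lowpass(current, s)
        bands.append(current - low)  # detail at this scale
        current = low
    bands.append(current)  # coarsest residual
    return bands

def scale_intervention(x, band_idx, amplitude):
    """Deterministic scale-space intervention: rescale one band, recompose.
    Every pixel changes smoothly; no hard occlusion artifacts appear."""
    bands = band_decompose(x)
    bands[band_idx] = amplitude * bands[band_idx]
    return sum(bands)

rng = np.random.default_rng(1)
field = rng.standard_normal((32, 32))
# amplitude=1 leaves the field unchanged (the bands sum back to the input)
perturbed = scale_intervention(field, 1, 1.5)  # boost mid-scale structure
```

Sweeping `amplitude` continuously produces the "continuum of physically coherent states" the abstract describes, against which a generative model's cross-scale response can be probed.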