Embodied Interpretability: Linking Causal Understanding to Generalization in Vision-Language-Action Models

arXiv cs.RO / 5/4/2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The paper argues that vision-language-action (VLA) policies often break under distribution shift because they may rely on spurious visual correlations rather than task-relevant causal factors.
  • It reframes visual-action attribution as an interventional estimation problem and proposes the Interventional Significance Score (ISS) to measure the causal impact of visual regions on action predictions.
  • It also introduces the Nuisance Mass Ratio (NMR) to quantify how much attribution is directed toward task-irrelevant features.
  • The authors analyze the statistical properties of ISS, showing that it admits unbiased estimation, and identify conditions under which action prediction error can serve as a proxy for causal influence.
  • Experiments across multiple manipulation tasks suggest NMR correlates with generalization performance and that ISS produces more faithful explanations than existing interpretability methods, offering a diagnostic for causal misalignment in embodied policies.
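The interventional idea behind ISS and NMR can be illustrated with a toy sketch: replace a visual region with random reference content (a do-style intervention), measure the expected change in the predicted action, and compute what fraction of that influence falls on task-irrelevant regions. Everything below (the `toy_policy`, the region masks, the L2 change as the influence measure, and the exact form of the ratio) is a hypothetical stand-in, not the paper's actual definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_policy(image):
    # Stand-in VLA policy head: the action depends strongly on the "object"
    # patch (rows 0-3) and only weakly on the "background" patch (rows 4-7).
    obj = image[:4].mean()
    bg = image[4:].mean()
    return np.array([2.0 * obj + 0.2 * bg, -1.0 * obj])

def interventional_score(image, region_mask, policy, n_samples=64):
    # ISS-style score (sketch): expected L2 change in the predicted action
    # when the masked region is resampled from a reference distribution.
    base = policy(image)
    deltas = []
    for _ in range(n_samples):
        intervened = image.copy()
        intervened[region_mask] = rng.uniform(0.0, 1.0, size=int(region_mask.sum()))
        deltas.append(np.linalg.norm(policy(intervened) - base))
    return float(np.mean(deltas))

image = rng.uniform(0.0, 1.0, size=(8, 8))
object_mask = np.zeros((8, 8), dtype=bool)
object_mask[:4] = True          # task-relevant region
background_mask = ~object_mask  # task-irrelevant (nuisance) region

iss_obj = interventional_score(image, object_mask, toy_policy)
iss_bg = interventional_score(image, background_mask, toy_policy)

# NMR-style ratio (sketch): share of total attribution mass that lands on
# nuisance regions. A policy driven by spurious background cues would score high.
nmr = iss_bg / (iss_obj + iss_bg)
print(f"ISS(object)={iss_obj:.3f}  ISS(background)={iss_bg:.3f}  NMR={nmr:.3f}")
```

Because the toy policy weights the object patch far more heavily than the background, the interventional score concentrates on the object region and the nuisance ratio stays small; a policy relying on spurious background correlations would show the opposite pattern.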

Abstract

Vision-Language-Action (VLA) policies often fail under distribution shift, suggesting that decisions may depend on spurious visual correlations rather than task-relevant causes. We formulate visual-action attribution as an interventional estimation problem. Accordingly, we introduce the Interventional Significance Score (ISS), an interventional masking procedure for estimating the causal influence of visual regions on action predictions, and the Nuisance Mass Ratio (NMR), a scalar measure of attribution to task-irrelevant features. We analyze the statistical properties of ISS and show that it admits unbiased estimation, and we characterize conditions under which action prediction error provides a valid proxy for causal influence. Experiments across diverse manipulation tasks indicate that NMR predicts generalization behavior and that ISS yields more faithful explanations than existing interpretability methods. These results suggest that interventional attribution provides a simple diagnostic approach for identifying causal misalignment in embodied policies.