Estimating Joint Interventional Distributions from Marginal Interventional Data
arXiv stat.ML / 4/20/2026
Key Points
- The paper presents a method to derive the full joint interventional distribution of variables by combining observational data with marginal interventional data, using the Maximum Entropy principle.
- It extends the Causal Maximum Entropy framework to incorporate interventional constraints, and proves via Lagrange duality that the resulting solution remains within the exponential family.
- The proposed approach supports two key applications when only marginal interventional distributions are available for subsets of variables: causal feature selection from a mixture of data sources and inference of joint interventional distributions.
- In synthetic experiments, the method improves over the state of the art for merging datasets and matches the performance of the kernel-based conditional independence (KCI) test, which requires access to joint observational data for all variables.
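To make the exponential-family claim concrete, here is a minimal sketch of a discrete Maximum Entropy fit under moment constraints: two binary variables whose means play the role of marginal (interventional) constraints, plus one cross-moment standing in for an observational constraint. The target numbers are illustrative, not from the paper, and the code solves the convex Lagrange dual directly rather than implementing the authors' Causal Maximum Entropy method.

```python
import numpy as np
from scipy.optimize import minimize

# States of two binary variables (X, Y) in the order (0,0), (0,1), (1,0), (1,1).
states = np.array([(x, y) for x in (0, 1) for y in (0, 1)], dtype=float)
# Feature map: [X, Y, X*Y] -- two marginal means plus one joint moment.
F = np.column_stack([states[:, 0], states[:, 1], states[:, 0] * states[:, 1]])

# Target moments: two "marginal interventional" means and one
# observational cross-moment (illustrative values, chosen to be feasible).
targets = np.array([0.7, 0.4, 0.35])

def dual(lam):
    # Log-partition minus lam . targets: the convex Lagrange dual of the
    # MaxEnt problem; its minimizer gives the exponential-family solution.
    logZ = np.log(np.exp(F @ lam).sum())
    return logZ - lam @ targets

res = minimize(dual, np.zeros(3), method="BFGS")

# MaxEnt joint distribution: p(x, y) proportional to exp(lambda . features).
p = np.exp(F @ res.x)
p /= p.sum()

print(np.round(p, 4))        # joint distribution over the four states
print(np.round(F.T @ p, 4))  # recovered moments, approximately the targets
```

Because the dual is smooth and convex, a generic quasi-Newton solver recovers the unique exponential-family distribution whose moments match the constraints, mirroring (in miniature) how the paper's Lagrange-duality argument keeps the solution inside the exponential family.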