Neural Conditional Transport Maps

arXiv stat.ML / 4/2/2026

Key Points

  • The paper proposes a neural framework for learning conditional optimal transport (OT) maps between probability distributions, supporting conditioning on both categorical and continuous variables at the same time.
  • It uses a hypernetwork that generates the parameters of the transport layers from the conditioning inputs, producing adaptive mappings that outperform simpler conditioning baselines.
  • Extensive ablation studies are reported to validate that the proposed hypernetwork-based conditioning and architecture design drive the performance gains.
  • The authors demonstrate an application to global sensitivity analysis, where the method achieves strong performance when computing OT-based sensitivity indices.
  • The work positions conditional OT learning as a step toward applying OT methods in high-dimensional settings such as generative modeling and black-box model explainability.
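The hypernetwork-based conditioning described above can be sketched in a few lines. This is an illustrative toy, not the authors' architecture: all names, dimensions, and the choice of a single affine transport layer are assumptions made here for clarity. The idea is that a small network maps the concatenated conditioning inputs (one-hot categorical plus continuous) to the parameters of the transport map itself, so each condition induces its own map.

```python
import numpy as np

# Toy sketch (not the paper's code): a hypernetwork that turns a
# conditioning vector c into the weights of an affine transport layer
# T_c(x) = A(c) x + b(c).
rng = np.random.default_rng(0)

d = 2              # dimension of the transported samples
n_cat = 3          # categories, one-hot encoded
n_cont = 1         # continuous conditioning variables
c_dim = n_cat + n_cont
hidden = 16
out_dim = d * d + d  # entries of A plus entries of b

# Hypernetwork: a small MLP mapping condition -> transport-layer parameters.
W1 = rng.normal(scale=0.1, size=(c_dim, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, out_dim))
b2 = np.zeros(out_dim)

def hypernet(c):
    """Produce the flattened transport-layer parameters for condition c."""
    h = np.tanh(c @ W1 + b1)
    return h @ W2 + b2

def transport(x, c):
    """Apply the condition-dependent affine map T_c(x) = A x + b."""
    theta = hypernet(c)
    A = theta[: d * d].reshape(d, d) + np.eye(d)  # initialized near identity
    b = theta[d * d:]
    return x @ A.T + b

# Usage: condition on categorical class 1 and continuous value 0.5.
c = np.concatenate([np.eye(n_cat)[1], [0.5]])
x = rng.normal(size=(5, d))
y = transport(x, c)
print(y.shape)  # (5, 2)
```

Note that, unlike input concatenation (the "simpler conditioning baseline" the paper compares against), here the condition determines the map's weights directly, which is what makes the mapping adaptive per condition.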

Abstract

We present a neural framework for learning conditional optimal transport (OT) maps between probability distributions. Our approach introduces a conditioning mechanism capable of processing both categorical and continuous conditioning variables simultaneously. At the core of our method lies a hypernetwork that generates transport layer parameters based on these inputs, creating adaptive mappings that outperform simpler conditioning methods. Comprehensive ablation studies demonstrate the superior performance of our method over baseline configurations. Furthermore, we showcase an application to global sensitivity analysis, offering high performance in computing OT-based sensitivity indices. This work advances the state-of-the-art in conditional optimal transport, enabling broader application of optimal transport principles to complex, high-dimensional domains such as generative modeling and black-box model explainability.
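As a toy illustration of the OT-based sensitivity indices mentioned above (again, not the paper's method): in one dimension, the squared 2-Wasserstein distance between two equal-size empirical samples reduces to the mean squared difference of their order statistics, so a crude sensitivity index for an input can average the distance between the conditional output distribution and the marginal. The model, the binning scheme, and the variance normalization below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def w2_sq(a, b):
    """Squared 2-Wasserstein distance between two equal-size 1D samples:
    in 1D the optimal coupling matches sorted order statistics."""
    return np.mean((np.sort(a) - np.sort(b)) ** 2)

# Toy model: Y = 3*X1 + 0.1*X2, so X1 should receive a much larger index.
n = 20000
X = rng.uniform(-1, 1, size=(n, 2))
Y = 3 * X[:, 0] + 0.1 * X[:, 1]

def ot_index(i, bins=10):
    """Average W2^2 between Y | (X_i in quantile bin) and the marginal of Y,
    normalized by Var(Y) -- one possible normalization, a modeling choice."""
    edges = np.quantile(X[:, i], np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, X[:, i], side="right") - 1, 0, bins - 1)
    total = 0.0
    for b in range(bins):
        yb = Y[idx == b]
        # compare against an equal-size subsample of the marginal
        ym = rng.choice(Y, size=yb.size, replace=False)
        total += yb.size / n * w2_sq(yb, ym)
    return total / Y.var()

s1, s2 = ot_index(0), ot_index(1)
print(s1, s2)  # s1 should dominate s2
```

Computing such indices requires one conditional-vs-marginal comparison per input and per condition value, which is exactly where a learned conditional transport map amortizes the cost relative to re-solving OT problems from scratch.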