Convolutional Maximum Mean Discrepancy for Inference in Noisy Data
arXiv stat.ML / 4/15/2026
Key Points
- The paper proposes a new inference framework for data contaminated by measurement error, including potentially heteroscedastic noise drawn from a known distribution.
- It introduces convolutional Maximum Mean Discrepancy (convMMD), which compares distributions after convolving with the noise while preserving metric validity under standard kernel assumptions.
- The authors derive finite-sample deviation bounds that are not degraded by the measurement error, and establish an equivalence between hypothesis testing under noise and kernel smoothing.
- They present a convMMD-based estimator with proofs of consistency and asymptotic normality, along with an efficient implementation using stochastic gradient descent.
- Experiments and real-world applications (notably in astronomy and social sciences) demonstrate the method’s practical effectiveness under noisy observational settings.
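The core idea behind comparing distributions "after convolving with the noise" can be illustrated concretely: push candidate model samples through the same known noise channel as the data, then compute a standard unbiased MMD estimate between the noisy observations and the noise-convolved model samples. The sketch below is illustrative only, not the paper's implementation; the Gaussian kernel, Gaussian noise model, and all function names are assumptions for the example.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between row-sample arrays X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    # Standard unbiased estimator of squared MMD between samples X and Y.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    n, m = len(X), len(Y)
    np.fill_diagonal(Kxx, 0.0)  # drop diagonal terms for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return Kxx.sum() / (n * (n - 1)) + Kyy.sum() / (m * (m - 1)) \
        - 2.0 * Kxy.sum() / (n * m)

rng = np.random.default_rng(0)
noise_sd = 0.5  # assumed known measurement-noise scale

# Noisy observations: latent signal plus measurement error.
y = rng.normal(1.0, 1.0, size=(500, 1)) + rng.normal(0.0, noise_sd, size=(500, 1))

# Candidate model samples, convolved with the SAME known noise distribution
# before comparison -- the "convolutional" step.
x_model = rng.normal(1.0, 1.0, size=(500, 1))
x_conv = x_model + rng.normal(0.0, noise_sd, size=(500, 1))

# Small value when the model matches the latent distribution.
print(mmd2_unbiased(y, x_conv))
```

A mismatched model (say, a shifted mean) yields a markedly larger statistic, which is what makes the discrepancy usable for estimation and testing under noise.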