CLAPS: Aleatoric-Epistemic Scaling via Last-Layer Laplace for Conformal Regression
arXiv stat.ML / 5/6/2026
Key Points
- The paper tackles a limitation of conformal regression: while it guarantees finite-sample marginal coverage, it does not by itself specify how to adapt prediction-interval width to heterogeneous inputs.
- It introduces CLAPS (Conformal Laplace-Aware Predictive Scaling), a split conformal regression approach that uses heteroscedastic last-layer Laplace uncertainty as the input-dependent normalization scale.
- CLAPS explicitly combines aleatoric uncertainty (learned, input-dependent noise) with epistemic uncertainty (from last-layer Laplace), aiming to produce more appropriate interval widths across regions with weak training support.
- The authors derive how this aleatoric–epistemic scale relates to last-layer precision and show that the method reduces to aleatoric-only local scaling as epistemic uncertainty diminishes.
- Experiments report empirical coverage matching the nominal calibration targets, with interval efficiency competitive with existing methods.
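The normalized split conformal recipe the points above describe can be sketched as follows. This is a minimal illustration, not the paper's implementation: the mean predictor and the aleatoric/epistemic variance functions below are synthetic stand-ins (assumptions) for a trained network and its last-layer Laplace approximation, but the calibration and interval construction follow standard normalized split conformal regression.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Heteroscedastic regression data: noise grows with |x|.
    x = rng.uniform(-3, 3, size=n)
    y = np.sin(x) + rng.normal(scale=0.1 + 0.2 * np.abs(x))
    return x, y

# Stand-in heads (hypothetical, not CLAPS itself):
mu = np.sin                                            # point prediction
aleatoric_var = lambda x: (0.1 + 0.2 * np.abs(x))**2   # learned noise proxy
epistemic_var = lambda x: 0.05 * x**2                  # Laplace-style proxy, grows off-support

def scale(x):
    # CLAPS-style combined scale: aleatoric plus epistemic variance.
    return np.sqrt(aleatoric_var(x) + epistemic_var(x))

def conformal_interval(x_cal, y_cal, x_test, alpha=0.1):
    # Normalized nonconformity scores on a held-out calibration split.
    scores = np.abs(y_cal - mu(x_cal)) / scale(x_cal)
    n = len(scores)
    # Finite-sample-corrected quantile for (1 - alpha) marginal coverage.
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
    half = q * scale(x_test)
    return mu(x_test) - half, mu(x_test) + half

x_cal, y_cal = make_data(2000)
x_test, y_test = make_data(5000)
lo, hi = conformal_interval(x_cal, y_cal, x_test)
coverage = np.mean((y_test >= lo) & (y_test <= hi))
print(f"empirical coverage: {coverage:.3f}")
```

Because the nonconformity score is divided by `scale(x)`, the resulting interval half-width `q * scale(x)` widens where either noise or epistemic uncertainty is large; when the epistemic term vanishes, this reduces to aleatoric-only local scaling, as the bullet above notes.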