Conformalized Percentile Interval: Finite Sample Validity and Improved Conditional Performance
arXiv stat.ML / 5/6/2026
Key Points
- The paper introduces a conformal-style method that builds finite-sample-adjusted percentile prediction intervals using PIT (probability integral transform) values from a neural-network-estimated conditional CDF.
- It argues that calibrating in PIT space reduces feature-dependent miscoverage when the CDF estimator is accurate, improving conditional validity and calibration.
- The approach is designed to remain robust even when the conditional CDF estimate is imperfect: percentile levels are calibrated against the empirical PIT distribution rather than trusted from the model.
- The authors provide theoretical guarantees, including finite-sample marginal coverage and asymptotic conditional coverage under mild consistency assumptions.
- Experiments on synthetic and real-world benchmarks show improved conditional calibration and substantially shorter intervals compared with existing methods.
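The pipeline the key points describe can be sketched in a few lines. The snippet below is our own minimal illustration, not the paper's implementation: it assumes a (deliberately misspecified) conditional CDF estimator `cdf`, computes PIT values on a calibration set, picks finite-sample-adjusted percentile levels from the empirical PIT distribution, and inverts the CDF on a grid to form intervals. All function names and the toy data-generating process are assumptions for illustration.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

def cdf(y, x):
    # Toy "estimated" conditional CDF: Gaussian with a slightly wrong scale
    # (1.1 instead of the true 1.0), standing in for a neural CDF estimator.
    return 0.5 * (1.0 + erf((y - x) / (1.1 * np.sqrt(2.0))))

# Simulated calibration data: Y | X = x ~ N(x, 1)
n = 2000
x_cal = rng.uniform(0, 5, n)
y_cal = x_cal + rng.standard_normal(n)

# PIT values of the calibration responses under the estimated CDF
u = np.array([cdf(yi, xi) for yi, xi in zip(y_cal, x_cal)])

# Finite-sample-adjusted percentile levels from the empirical PIT
# distribution (split-conformal-style rank correction, alpha/2 per tail)
alpha = 0.1
lo = np.quantile(u, np.floor((n + 1) * alpha / 2) / n)
hi = np.quantile(u, np.ceil((n + 1) * (1 - alpha / 2)) / n)

def interval(x, grid=np.linspace(-5, 12, 4000)):
    # Invert the estimated conditional CDF at the calibrated PIT levels
    vals = np.array([cdf(g, x) for g in grid])
    return grid[np.searchsorted(vals, lo)], grid[np.searchsorted(vals, hi)]

# Empirical marginal coverage on fresh test data (typically near 1 - alpha)
x_te = rng.uniform(0, 5, 2000)
y_te = x_te + rng.standard_normal(2000)
cov = np.mean([l <= yi <= r
               for (l, r), yi in ((interval(xi), yi)
                                  for xi, yi in zip(x_te, y_te))])
print(round(float(cov), 2))
```

Because the percentile levels `lo` and `hi` are read off the empirical PIT distribution, the interval inherits marginal coverage even though the CDF's scale is misspecified; if the CDF estimator were exact, `lo` and `hi` would land near `alpha/2` and `1 - alpha/2` and the intervals would also be conditionally calibrated.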