FIBER: A Differentially Private Optimizer with Filter-Aware Innovation Bias Correction
arXiv cs.LG / 5/6/2026
Key Points
- The paper explains that temporally filtering differentially private (DP) gradients changes the statistics of the injected DP noise, which miscalibrates the bias-correction and second-moment terms of adaptive optimizers such as AdamW.
- It proposes FiBeR, a differentially private optimizer specifically designed for temporally filtered privatized gradients.
- FiBeR performs denoising in the innovation space, decouples optimizer geometry from innovation gain so the two can be tuned independently, and applies a filter-aware second-moment calibration that subtracts an attenuated DP-noise term A(ω)·σ_w² from the raw estimate.
- The authors provide a closed-form method to compute A(ω) for general stable linear filters and report substantial benchmark improvements over existing DP optimizers under the same privacy constraints across vision and language tasks.
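The calibration described above can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function names are hypothetical, and the filter is assumed to be a simple exponential moving average (EMA) with coefficient β. For any stable linear filter with impulse response h, white noise of variance σ_w² passes through with variance (Σₖ hₖ²)·σ_w², which gives a closed-form attenuation factor; for the EMA special case this sum evaluates to A = (1−β)/(1+β).

```python
def noise_attenuation_from_impulse(h):
    """Attenuation factor for any stable linear filter given its impulse
    response h: filtered white noise of variance sigma^2 has variance
    (sum of h_k^2) * sigma^2. This is a standard signal-processing fact,
    not the paper's exact formula for A(omega)."""
    return sum(hk * hk for hk in h)

def ema_noise_attenuation(beta):
    """Closed-form attenuation for the EMA filter
    y_t = beta * y_{t-1} + (1 - beta) * x_t applied to i.i.d. noise:
    A = (1 - beta)^2 / (1 - beta^2) = (1 - beta) / (1 + beta)."""
    return (1.0 - beta) / (1.0 + beta)

def calibrated_second_moment(v, beta, sigma_w):
    """Hypothetical filter-aware calibration: subtract the attenuated
    DP-noise contribution A * sigma_w^2 from each entry of the raw
    second-moment estimate v, clipping at zero to stay non-negative."""
    a = ema_noise_attenuation(beta)
    return [max(vi - a * sigma_w ** 2, 0.0) for vi in v]

# Example: a raw second-moment estimate partly inflated by DP noise.
v_raw = [0.50, 0.12, 0.05]
v_cal = calibrated_second_moment(v_raw, beta=0.9, sigma_w=0.2)
```

The clip at zero mirrors the usual safeguard when subtracting a noise-variance estimate: without it, a small raw estimate could go negative and break the subsequent inverse-square-root step of an Adam-style update.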