Zero-Shot Quantization via Weight-Space Arithmetic
arXiv cs.CV / 4/7/2026
Key Points
- The paper introduces “quantization vectors,” a transferable direction in weight space that can be extracted from a donor task using simple weight-space arithmetic.
- Applying this vector to a receiver model can significantly improve robustness to post-training quantization (PTQ) noise (reported gains of up to ~60%) without any receiver-side quantization-aware training (QAT).
- The approach is presented as zero-shot and low-cost because it does not need receiver training data.
- Experiments are demonstrated on Vision Transformer (ViT) models, supporting the claim that quantization robustness can be a reusable property of weight-space geometry.
- Overall, the results position quantization robustness as transferable across tasks rather than something that must be relearned via task-specific training.
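The weight-space arithmetic described above can be sketched in a few lines. This is a minimal, illustrative toy, not the paper's implementation: the function names, the toy symmetric uniform quantizer, and the scaling factor `alpha` are all assumptions; the core idea shown is subtracting a donor's base weights from its quantization-robust weights to get a transferable direction, then adding that direction to a receiver's weights.

```python
import numpy as np

def quantize(w, n_bits=4):
    # Toy symmetric uniform PTQ (round-to-nearest), for illustration only.
    scale = np.abs(w).max() / (2 ** (n_bits - 1) - 1)
    return np.round(w / scale) * scale

def extract_quantization_vector(donor_robust, donor_base):
    # The hypothesized "quantization vector": the weight-space direction
    # separating a donor's quantization-robust weights from its baseline.
    return donor_robust - donor_base

def apply_quantization_vector(receiver, qvec, alpha=1.0):
    # Zero-shot transfer: shift the receiver along the donor's direction.
    # alpha is a hypothetical scaling knob, not from the paper.
    return receiver + alpha * qvec

# Toy demonstration on random matrices standing in for model weights.
rng = np.random.default_rng(0)
donor_base = rng.normal(size=(8, 8))
donor_robust = donor_base + 0.1 * rng.normal(size=(8, 8))
receiver = rng.normal(size=(8, 8))

qvec = extract_quantization_vector(donor_robust, donor_base)
receiver_shifted = apply_quantization_vector(receiver, qvec)
ptq_error = np.abs(quantize(receiver_shifted) - receiver_shifted).mean()
```

The appeal of this formulation is that it needs only two weight checkpoints from the donor and no receiver training data, which is what makes the claimed transfer "zero-shot."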