ZO-SAM: Zero-Order Sharpness-Aware Minimization for Efficient Sparse Training
arXiv cs.LG · March 16, 2026
Key Points
- ZO-SAM combines zero-order optimization with Sharpness-Aware Minimization (SAM), replacing the perturbation-step gradient with a zero-order estimate so that each iteration requires only a single backpropagation pass (see the sketch after this list).
- It halves the backpropagation cost compared with conventional SAM and lowers gradient variance, addressing bottlenecks in sparse training.
- The approach leverages SAM's flat-minima pursuit to stabilize training and speed up convergence for sparse networks.
- Models trained with ZO-SAM show improved robustness under distribution shift, making them better suited to real-world deployment.
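
To make the mechanics concrete, below is a minimal PyTorch sketch of how a zero-order SAM step could be structured. The paper's exact estimator is not described here, so the two-point Gaussian probe, the function name `zo_sam_step`, and the hyperparameters `rho` and `mu` are illustrative assumptions, not the authors' implementation.

```python
import torch

def zo_sam_step(model, loss_fn, inputs, targets, optimizer, rho=0.05, mu=1e-3):
    # Hypothetical ZO-SAM step: the SAM ascent direction is estimated with a
    # two-point zero-order probe (forward passes only), so each iteration
    # needs a single backpropagation instead of SAM's usual two.
    params = [p for p in model.parameters() if p.requires_grad]

    with torch.no_grad():
        # Probe the loss along a random Gaussian direction u.
        u = [torch.randn_like(p) for p in params]
        for p, d in zip(params, u):
            p.add_(mu * d)
        loss_plus = loss_fn(model(inputs), targets)
        for p, d in zip(params, u):
            p.sub_(2.0 * mu * d)
        loss_minus = loss_fn(model(inputs), targets)
        for p, d in zip(params, u):
            p.add_(mu * d)  # restore the original weights

        # Directional-derivative estimate scales u into an ascent direction.
        g_hat = (loss_plus - loss_minus) / (2.0 * mu)
        eps = [g_hat * d for d in u]
        norm = torch.sqrt(sum((e * e).sum() for e in eps)) + 1e-12
        for p, e in zip(params, eps):
            p.add_(rho * e / norm)  # move to the perturbed point w + eps

    # The only backward pass: gradient at the perturbed point, as in SAM.
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    with torch.no_grad():
        # Undo the perturbation so the update applies at the original weights.
        for p, e in zip(params, eps):
            p.sub_(rho * e / norm)
    optimizer.step()
    return loss.detach()
```

Because the probe uses only forward passes, the perturbation step costs inference rather than training compute, which is where the claimed halving of backpropagation cost would come from.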
Related Articles
I Was Wrong About AI Coding Assistants. Here's What Changed My Mind (and What I Built About It).
Dev.to

Interesting loop
Reddit r/LocalLLaMA
Qwen3.5-122B-A10B Uncensored (Aggressive) — GGUF Release + new K_P Quants
Reddit r/LocalLLaMA
A supervisor or "manager" AI agent is the wrong way to control AI
Reddit r/artificial
FeatherOps: Fast fp8 matmul on RDNA3 without native fp8
Reddit r/LocalLLaMA