One-to-More: High-Fidelity Training-Free Anomaly Generation with Attention Control
arXiv cs.CV / 3/20/2026
📰 News · Models & Research
Key Points
- O2MAG is introduced as a training-free few-shot anomaly generation method that uses self-attention from a reference anomalous image to synthesize more realistic anomalies for industrial anomaly detection.
- The method runs three parallel diffusion processes with self-attention grafting, and incorporates an anomaly mask to reduce foreground-background query confusion while still enabling text-guided anomaly synthesis (see the grafting sketch after this list).
- Anomaly-Guided Optimization is proposed to better align generated anomalies with the target anomalous distribution, improving both realism and text consistency (a minimal optimization sketch follows below).
- Dual-Attention Enhancement reinforces both self- and cross-attention on masked regions to mitigate faint anomaly synthesis inside the anomaly mask (see the attention-reweighting sketch below). Extensive experiments show the method outperforms prior state-of-the-art approaches on downstream anomaly detection (AD) tasks.
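
The key points only name the mechanisms, so here is a minimal PyTorch sketch of what masked self-attention grafting could look like, assuming a standard scaled-dot-product self-attention layer inside a diffusion U-Net. The function and argument names (`graft_self_attention`, `ref_k`, `ref_v`, `anomaly_mask`) are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical sketch: masked self-attention grafting from a reference anomalous branch.
import torch
import torch.nn.functional as F


def graft_self_attention(q, k, v, ref_k, ref_v, anomaly_mask, num_heads=8):
    """Queries inside the anomaly mask attend to key/value tokens grafted from the
    reference anomalous branch; queries outside the mask keep the original keys/values,
    which is one way to limit foreground-background query confusion.

    q, k, v:      (B, N, C) tokens of the target diffusion branch
    ref_k, ref_v: (B, N, C) key/value tokens from the reference anomalous branch
    anomaly_mask: (B, N) binary mask over query positions, 1 = anomalous region
    """
    def attend(query, key, value):
        b, n, c = query.shape
        split = lambda x: x.view(b, -1, num_heads, c // num_heads).transpose(1, 2)
        out = F.scaled_dot_product_attention(split(query), split(key), split(value))
        return out.transpose(1, 2).reshape(b, n, c)

    normal_out = attend(q, k, v)             # attention within the target branch
    grafted_out = attend(q, ref_k, ref_v)    # attention into the reference branch
    mask = anomaly_mask.unsqueeze(-1).float()  # (B, N, 1)
    return mask * grafted_out + (1.0 - mask) * normal_out
```

In this reading, the mask decides per query position whether the output comes from the target branch or from the grafted reference attention; the paper's actual grafting rule may differ.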
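
Anomaly-Guided Optimization is described only at a high level. A hedged sketch, under the assumption that it amounts to gradient steps pulling masked-region features toward those of the reference anomaly, might look like the following; the feature extractor, loss, learning rate, and step count are all assumptions for illustration.

```python
# Hypothetical sketch: gradient-based alignment of the masked region with the reference anomaly.
import torch


def anomaly_guided_step(latent, ref_feat, extract_feat, anomaly_mask, lr=0.05, steps=5):
    """Optimize `latent` so masked-region features move toward the reference anomaly features.

    latent:       (B, C, H, W) current diffusion latent
    ref_feat:     (B, C_f, H, W) features of the reference anomalous image
    extract_feat: callable mapping a latent to (B, C_f, H, W) features
    anomaly_mask: (B, 1, H, W) binary mask of the anomaly region
    """
    latent = latent.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([latent], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feat = extract_feat(latent)
        # Only the masked (anomalous) region is pulled toward the reference features.
        loss = ((feat - ref_feat) ** 2 * anomaly_mask).sum() / anomaly_mask.sum().clamp(min=1)
        loss.backward()
        opt.step()
    return latent.detach()
```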
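
Similarly, reinforcing self- and cross-attention on the masked region could be implemented by upweighting attention mass that lands on masked key tokens and renormalizing; the boost factor and where this is applied are assumptions, not details from the paper.

```python
# Hypothetical sketch: boosting attention on masked tokens to avoid faint anomaly synthesis.
import torch


def enhance_masked_attention(attn_weights, anomaly_mask, boost=1.5):
    """Upweight attention probabilities falling on tokens inside the anomaly mask,
    then renormalize each row so it still sums to one.

    attn_weights: (B, H, N_q, N_k) softmaxed attention maps (self- or cross-attention)
    anomaly_mask: (B, N_k) binary mask over key tokens, 1 = anomalous region
    """
    scale = 1.0 + (boost - 1.0) * anomaly_mask.float()  # (B, N_k)
    scale = scale[:, None, None, :]                     # broadcast to (B, 1, 1, N_k)
    boosted = attn_weights * scale
    return boosted / boosted.sum(dim=-1, keepdim=True)
```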