One-to-More: High-Fidelity Training-Free Anomaly Generation with Attention Control
arXiv cs.CV · March 20, 2026
Key Points
- O2MAG is introduced as a training-free few-shot anomaly generation method that uses self-attention from a reference anomalous image to synthesize more realistic anomalies for industrial anomaly detection.
- The method leverages three parallel diffusion processes with self-attention grafting and incorporates an anomaly mask to reduce foreground-background query confusion while enabling text-guided anomaly synthesis.
- Anomaly-Guided Optimization is proposed to better align generated anomalies with the target anomalous distribution, enhancing realism and text consistency.
- Dual-Attention Enhancement reinforces both self- and cross-attention on masked regions, mitigating the synthesis of overly faint anomalies inside the anomaly mask.
- Extensive experiments show O2MAG outperforms prior state-of-the-art methods on downstream anomaly detection tasks.
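To make the attention-control ideas above concrete, the following is a minimal NumPy sketch of mask-gated self-attention grafting, not the paper's actual implementation: queries inside the anomaly mask attend to keys/values taken from the reference anomalous image (grafting), while background queries keep the generation branch's own keys/values, which is one way to avoid foreground-background query confusion. The function name, the single-head layout, and the scalar `boost` (a crude stand-in for the paper's attention enhancement on masked regions) are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def grafted_self_attention(q_gen, k_ref, v_ref, k_gen, v_gen, mask, boost=1.5):
    """Toy mask-gated self-attention grafting (illustrative, not O2MAG itself).

    q_gen:  (T, d) queries from the generation branch
    k_ref, v_ref: (T, d) keys/values grafted from the reference anomalous image
    k_gen, v_gen: (T, d) the generation branch's own keys/values
    mask:   (T,)  binary anomaly mask over query tokens
    boost:  scalar emphasis on the grafted (anomalous) output region
    """
    d = q_gen.shape[-1]
    # Attention over the grafted reference K/V and over the native K/V.
    attn_ref = softmax(q_gen @ k_ref.T / np.sqrt(d))
    attn_gen = softmax(q_gen @ k_gen.T / np.sqrt(d))
    out_ref = attn_ref @ v_ref
    out_gen = attn_gen @ v_gen
    # Masked queries take the (boosted) reference pathway; background
    # queries keep the generation pathway, separating the two roles.
    m = mask[:, None].astype(float)
    return boost * m * out_ref + (1.0 - m) * out_gen
```

In a real diffusion pipeline this substitution would happen inside selected U-Net self-attention layers at each denoising step; here it is reduced to a single attention call to show the gating logic.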