Noise Steering for Controlled Text Generation: Improving Diversity and Reading-Level Fidelity in Arabic Educational Story Generation
arXiv cs.CL / 4/7/2026
Key Points
- The paper studies “noise steering” for Arabic educational story generation, aiming to increase narrative diversity while maintaining strict constraints on vocabulary, reading level, and narrative structure.
- It proposes a training-free method: injecting calibrated Gaussian perturbations into transformer internal representations at inference time, evaluating four injection strategies across five small (7–9B-parameter) Arabic-centric language models.
- Residual-stream noise improves story diversity while incurring minimal loss in overall quality or constraint adherence and maintains early-grade reading level across all tested models.
- An attention-entropy-guided variant (AENI) is reported to stabilize attention-logit noise injection and recover quality, outperforming the less stable unguided approaches.
- In contrast, high-temperature sampling increases reading grade level and can lead to “catastrophic collapse” in several models, suggesting internal perturbations are better than output-level randomness for constrained educational generation.
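The core mechanism described above — perturbing internal activations rather than sampling temperature — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the function name, the `sigma` parameter, and the norm-based calibration scheme are assumptions about how "calibrated Gaussian perturbations" to the residual stream might work.

```python
import numpy as np

def inject_residual_noise(hidden, sigma=0.05, rng=None):
    """Add calibrated Gaussian noise to a residual-stream activation.

    `hidden` is a (seq_len, d_model) activation matrix. The noise scale is
    tied to each token's activation norm so the perturbation stays
    proportional to the model's own magnitudes (a hypothetical calibration;
    the paper's exact scheme is not specified in this summary).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Per-token L2 norm, shape (seq_len, 1), used to calibrate the noise.
    norms = np.linalg.norm(hidden, axis=-1, keepdims=True)
    noise = rng.normal(0.0, 1.0, size=hidden.shape)
    # Divide by sqrt(d_model) so the expected noise norm is ~sigma * token norm.
    return hidden + sigma * norms / np.sqrt(hidden.shape[-1]) * noise
```

In a real decoder this would typically be applied via a forward hook on selected transformer layers during generation; with `sigma=0` the hidden states pass through unchanged, so the perturbation strength is a single tunable knob.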