Patch release v5.6.1

Transformers (HuggingFace) Releases / 4/23/2026


Key Points

  • The project issued patch release v5.6.1 after a regression broke the “flash attention” path.
  • The release includes a specific fix for an AttributeError occurring when s_aux is None in flash_attention_forward.
  • The fix was contributed via PR #45589 (by jamesbraza), addressing the reported runtime failure.
  • The maintainers acknowledged the issue and apologized to users for the disruption caused by this regression.


The flash attention path was broken! Sorry everyone for this one 🤗

  • Fix AttributeError on s_aux=None in flash_attention_forward (#45589) by @jamesbraza
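The fixed failure mode is a common one: an optional argument defaults to `None`, and downstream code accesses an attribute on it unconditionally. The sketch below illustrates the general guard pattern such a fix typically takes; the names `flash_attention_forward` and `s_aux` mirror the release note, but the function bodies are hypothetical illustrations, not the actual Transformers code.

```python
# Hypothetical sketch of guarding an optional argument so that
# passing s_aux=None no longer raises an AttributeError.

def flash_attention_forward(query, key, value, s_aux=None):
    # Before a fix like this, code might call e.g. s_aux.to(...)
    # unconditionally, raising AttributeError when s_aux is None.
    # The guarded version only forwards s_aux when it was provided.
    kwargs = {}
    if s_aux is not None:
        kwargs["s_aux"] = s_aux
    return _attention_impl(query, key, value, **kwargs)


def _attention_impl(query, key, value, s_aux=None):
    # Stand-in for the real attention kernel; just reports
    # whether the optional argument reached the implementation.
    return {"used_s_aux": s_aux is not None}
```

Calling `flash_attention_forward(q, k, v)` with no `s_aux` now takes the guarded branch instead of crashing, while callers that do pass a value see unchanged behavior.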