Mistral Medium 3.5 128B GGUFs are fixed

Reddit r/LocalLLaMA / 5/2/2026

📰 News · Developer Stack & Infrastructure · Tools & Practical Usage

Key Points

  • The post reports that existing Mistral Medium 3.5 128B GGUF files had issues that led to bad outputs, particularly when using long context.
  • It states that the problem has been fixed, with updated information provided via a Hugging Face discussion thread.
  • The author adds that, based on their experience, the updated GGUF version is much more stable even with short context.
  • They note that incorrect prompt formatting previously caused outputs to quickly degrade into gibberish, whereas the updated version is less sensitive to prompt-format mistakes.
  • Overall, the update improves reliability for local inference using these GGUF artifacts.

All GGUFs were broken, resulting in bad outputs, especially at long context.

Anyway, it is fixed now: https://huggingface.co/unsloth/Mistral-Medium-3.5-128B-GGUF/discussions/1

Edit: Unsloth Announcement: https://huggingface.co/unsloth/Mistral-Medium-3.5-128B-GGUF/discussions/5

Edit2: From my experience it is A LOT more stable, even at short context. I messed up the prompt format before and it quickly devolved into gibberish. The updated version doesn't really mind.
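To make the prompt-format point concrete, here is a minimal sketch of the kind of mistake described above. The exact chat template for Mistral Medium 3.5 lives in the model's tokenizer config and may differ; the classic Mistral `[INST]` wrapper below is shown only as an illustration, and both function names are hypothetical.

```python
# Illustrative sketch of Mistral-style instruct formatting (assumption:
# the classic [INST] template; Mistral Medium 3.5's actual template is
# defined by its tokenizer config and may differ).

def format_mistral_prompt(user_message: str) -> str:
    """Wrap a single user turn in the classic Mistral instruct template."""
    return f"<s>[INST] {user_message} [/INST]"

def format_broken_prompt(user_message: str) -> str:
    """A common mistake: the closing [/INST] tag is missing, so the model
    never sees where the user turn ends and can drift into gibberish."""
    return f"<s>[INST] {user_message}"
```

With the earlier GGUFs, feeding a malformed prompt like the second variant reportedly degraded output quickly; per the post, the fixed files are far more tolerant of such mistakes.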

submitted by /u/Sunija_Dev