Interestingly enough, Mistral Small is written as Mistral-Small-4-119B-2603. Their medium model will have 128B parameters. Either it will be a dense model, or a less sparse MoE than Mistral Small.
Mistral Medium Is On The Way
Reddit r/LocalLLaMA / 4/29/2026
💬 Opinion · Signals & Early Trends · Models & Research
Key Points
- The post expects Mistral’s next “Medium” model to have about 128B parameters, extrapolating from the naming pattern of the Mistral-Small-4-119B-2603 checkpoint.
- It suggests the Medium model may be either a dense architecture or a Mixture-of-Experts (MoE) variant with less sparsity than Mistral Small (see the sketch after this list for what that means in active parameters).
- The details appear to come from a Reddit submission/speculation rather than an official, confirmed announcement.
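To make the dense-versus-sparse distinction concrete, here is a minimal Python sketch of active-parameter counting. Every configuration in it (expert counts, top-k routing, the fraction of weights sitting in expert layers) is a hypothetical illustration, not a detail from the post, which gives only total sizes.

```python
# Minimal sketch: how dense, sparse-MoE, and "less sparse" MoE models
# differ in active parameters per token. All configs below are assumed
# for illustration; the post does not specify expert counts or routing.

def moe_active_params(total_b: float, n_experts: int, top_k: int,
                      expert_share: float = 0.9) -> float:
    """Rough active-parameter estimate (in billions) for a top-k MoE.

    expert_share: assumed fraction of total parameters living in the
    expert FFN layers; the remainder (attention, embeddings, router)
    is always active on every token.
    """
    shared = total_b * (1 - expert_share)
    experts = total_b * expert_share
    return shared + experts * (top_k / n_experts)

# A dense 128B model activates every parameter on every token.
print("dense 128B active: 128.0B")

# Hypothetical sparse MoE at 119B total (e.g., 64 experts, top-2):
# only a small slice of the expert weights runs per token.
print(f"sparse MoE active: {moe_active_params(119, 64, 2):.1f}B")

# A "less sparse" MoE at 128B total (e.g., 8 experts, top-2)
# activates a much larger fraction of its weights per token.
print(f"less-sparse MoE active: {moe_active_params(128, 8, 2):.1f}B")
```

Under these assumed numbers the sparse variant would run roughly 15B parameters per token and the less-sparse one roughly 42B, which is why "dense vs. less sparse MoE" matters for inference cost even at similar total sizes.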
Related Articles
How are LLMs 'corrected' when users identify them spreading misinformation or saying something harmful?
Reddit r/artificial
The future of software development: Now with less software development
The Register
The Landing: Portable Payload for AI Systems
Reddit r/artificial
AI Failures Happen When No One is Looking. Here's How to Fix Them.
Dev.to
I Made a CLI That Yells at Your Code Until It Gets an A
Dev.to