Your local LLM predictions and hopes for May 2026

Reddit r/LocalLLaMA / 5/2/2026

💬 Opinion · Signals & Early Trends · Industry & Market Moves · Models & Research

Key Points

  • The post is a community prompt asking readers to predict which local LLM families and sizes might be released in May 2026.
  • It lists a wide set of potential model releases and variants, including Gemma, Qwen (including Qwen Coder), GLM, Kimi, Nvidia/Nemotron-related models, and several other model lineups.
  • Readers are also invited to rank what they most want, spanning both new model announcements and potentially improved implementations or techniques.
  • The discussion includes hopes for hardware or platform advancements as well as unexpected new local LLMs from non-Nvidia vendors.
  • Overall, it functions as an informal forecast and wishlist poll rather than a report of any new event.

Which of these do you think we'll get in May? Also, feel free to pick or rank the ones you want most:

  • more Gemma4 models (124b?) (other sizes?)

  • more Qwen3.6 models (9b? 122b? 397b?)

  • new Qwen Coder model (80b Even Nexter?) (~397b/400b+ coder?)

  • new GLM model in the 100b-300b size range?

  • small Kimi model of some sort?

  • more Nvidia/Nemotron models?

  • new Stepfun model?

  • new OpenAI OSS model(s)?

  • Meta Avocado/Paricado model(s)?

  • more MiniMax model(s)? (maybe some different sizes)?

  • more MiMo model(s)? (maybe some different sizes)?

  • more Mistral models?

  • new Devstral models?

  • more DeepSeekv4 sizes?

  • more Granite models?

  • new Phi model(s)?

  • new NousResearch finetunes of any really big models?

  • more Bonsai models?

  • a model with a significantly improved version/implementation of engram?

  • Any new Taalas-style model-on-a-chip burners? (and maybe of bigger models)?

  • Any surprise new models from any other hardware players other than Nvidia (i.e. a local LLM from AMD, Intel, Samsung, Micron, or someone like that)?

  • other models?

  • Any interesting tech/methods/concepts/improvements you're predicting or hoping for?

submitted by /u/DeepOrangeSky