Nathan Lambert and Florian Brand have published a comprehensive analysis of open-model adoption from Nov 2023 to Mar 2026, tracking around 1.5K models across Hugging Face downloads, OpenRouter data, and other benchmarks. One of the biggest takeaways for me is the sheer dominance and scale of contributions from Chinese labs (especially Qwen) to the open-source ecosystem. To be honest, their initiative in open-sourcing models like Qwen and DeepSeek has also encouraged similar efforts from other labs across Europe and the US. I would even attribute the recent release and fast-tracking of Gemma4 to the success of Qwen3.5. I'd recommend everyone go through the report (even just the graphs) to see the scale of Chinese models' influence and adoption in the open-source community. Report link: https://atomproject.ai/atom_report.pdf
ATOM Report highlights the sheer dominance of Chinese labs in the Open-Source LLM space
Reddit r/LocalLLaMA / 4/9/2026
💬 Opinion · Signals & Early Trends · Ideas & Deep Analysis · Models & Research
Key Points
- The ATOM Report by Nathan Lambert and Florian Brand analyzes open-model adoption from Nov 2023 to Mar 2026 by tracking roughly 1.5K models across Hugging Face downloads, OpenRouter data, and other benchmarks.
- It finds that Chinese labs—particularly Qwen—show dominant scale in open-source LLM contributions and user adoption across the ecosystem.
- The analysis argues that open-sourcing by Chinese teams (e.g., Qwen and DeepSeek) has helped catalyze similar open-model efforts from labs in Europe and the US.
- The report suggests a possible influence on other model releases, citing Gemma4’s release and rapid progression as potentially tied to Qwen3.5’s success.
- It recommends readers review the report and its graphs to gauge the magnitude of Chinese models’ impact on the open-source community.
💡 Insights using this article
This article is featured in our daily AI news digest — key takeaways and action items at a glance.