Qwen3.6-35B-A3B Launched as Open Source.

Reddit r/artificial / 4/16/2026

📰 News · Signals & Early Trends · Tools & Practical Usage · Models & Research

Key Points

  • Qwen3.6-35B-A3B has been launched as an open-source model under the Apache 2.0 license, permitting broad reuse.
  • The model is described as a sparse MoE architecture with 35B total parameters but only 3B active parameters during inference.
  • The release claims agentic coding performance on par with models roughly 10× its active parameter count.
  • It is positioned as having robust multimodal perception and reasoning capabilities, including “multimodal thinking” plus distinct “thinking” and “non-thinking” modes.
  • Access is provided via Qwen Studio (chat.qwen.ai) and a Hugging Face model page for direct download and experimentation.
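The efficiency claim rests on sparse MoE routing: a gating network activates only a few experts per token, so the active parameter count (3B) is a small fraction of the total (35B). A minimal sketch of top-k gating with illustrative numbers; the expert count, k, and per-expert sizes below are hypothetical, not the actual Qwen3.6 hyperparameters:

```python
import math
import random

def topk_gate(scores, k):
    """Pick the k highest-scoring experts and softmax-normalize their weights."""
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    exps = [math.exp(scores[i]) for i in top]
    z = sum(exps)
    return {i: e / z for i, e in zip(top, exps)}

# Illustrative configuration (hypothetical, NOT the real model's layout):
num_experts = 64            # total experts in an MoE layer
k = 4                       # experts activated per token
params_per_expert = 0.5e9   # hypothetical parameter count per expert

random.seed(0)
gates = topk_gate([random.gauss(0, 1) for _ in range(num_experts)], k)
print(sorted(gates))  # indices of the k experts actually run for this token

active = k * params_per_expert
total = num_experts * params_per_expert
print(f"active fraction: {active / total:.1%}")
```

Only the k selected experts' weights participate in the forward pass, which is why inference cost tracks the 3B active figure rather than the 35B total.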

⚡ Meet Qwen3.6-35B-A3B: Now Open-Source! 🚀🚀

A sparse MoE model, 35B total params, 3B active. Apache 2.0 license.

🔥 Agentic coding on par with models 10x its active size

📷 Strong multimodal perception and reasoning ability

🧠 Multimodal thinking + non-thinking modes

Efficient. Powerful. Versatile. Try it now👇

Qwen Studio: chat.qwen.ai

Hugging Face: https://huggingface.co/Qwen/Qwen3.6-35B-A3B

submitted by /u/Infinite-pheonix