Tencent Releases Hy3 Preview - Open-Source 295B MoE with 21B Active Parameters

Reddit r/LocalLLaMA / 4/23/2026

📰 News · Signals & Early Trends · Tools & Practical Usage · Industry & Market Moves · Models & Research

Key Points

  • Tencent has released a preview version of its Hy3 model as open-source weights, available via Hugging Face.
  • The release is described as a 295B-parameter Mixture-of-Experts (MoE) model, with an estimated 21B parameters active during inference.
  • The model weights can be downloaded and used by developers and the broader open-source community for experimentation and integration.
  • The preview label indicates the release is intended for early testing rather than as a final, stable version.
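The appeal of the 295B/21B split is sparsity: although the full weight set is large, only a small fraction of parameters is active for any given token. A minimal sketch of that arithmetic, using the figures reported above (the byte-per-parameter assumptions are illustrative, not from the release notes):

```python
# Sparse-MoE arithmetic for the reported Hy3 figures.
# 295B total parameters, ~21B active per token (estimated).
total_params = 295e9
active_params = 21e9

# Fraction of the network that participates in each forward pass.
active_fraction = active_params / total_params
print(f"Active per token: {active_fraction:.1%}")  # ~7.1%

# Illustrative storage estimate, assuming 2 bytes/param (bf16) --
# the actual released precision may differ.
bytes_per_param = 2
weights_gb = total_params * bytes_per_param / 1e9
print(f"Approx. weight storage at bf16: {weights_gb:.0f} GB")
```

The takeaway is that per-token compute scales with the ~21B active parameters, while memory to hold the weights still scales with the full 295B.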