EngGPT2: Sovereign, Efficient and Open Intelligence
arXiv cs.CL, March 18, 2026
Key Points
- EngGPT2-16B-A3B is a newly announced Italian LLM from Engineering Group designed to be sovereign, efficient, and open, with explicit alignment to the EU AI Act.
- It uses a Mixture-of-Experts architecture trained from scratch, with 16 billion total parameters and roughly 3 billion active per inference step, delivering performance competitive with 8-16B dense models on benchmarks such as MMLU-Pro, GSM8K, IFEval, and HumanEval.
- It was trained on 2.5 trillion tokens, with roughly 25% Italian-language data to strengthen European and Italian NLP capabilities at this scale.
- EngGPT2 claims substantial efficiency gains: one-half to one-fifth of the inference power, and one-sixth to one-tenth of the training data and training power, of comparable models.
- It supports multiple reasoning modes, including non-reasoning, Italian-English reasoning, and turbo-reasoning, positioning it for real-time, multilingual use cases within the open-weight European AI ecosystem.
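The "16B total, ~3B active" figure reflects how Mixture-of-Experts models work: a router selects a small subset of experts per token, so only a fraction of the parameters participate in each forward pass. The toy sketch below illustrates top-k routing in general terms; the expert count, top-k value, and routing details are illustrative assumptions, not EngGPT2's actual (unpublished) configuration.

```python
# Toy sketch of Mixture-of-Experts top-k routing (illustrative only;
# EngGPT2's real expert count and router design are not public here).
import numpy as np

rng = np.random.default_rng(0)

D = 8           # hidden size (toy value)
N_EXPERTS = 16  # total experts (assumed for illustration)
TOP_K = 3       # experts activated per token (assumed)

# Each expert is a simple linear map D -> D; the router scores experts.
experts = [rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D, N_EXPERTS)) / np.sqrt(D)

def moe_forward(x):
    """Route token x to its TOP_K highest-scoring experts, mix their outputs."""
    logits = x @ router                    # one score per expert
    top = np.argsort(logits)[-TOP_K:]      # indices of the chosen experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over chosen experts only
    out = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    return out, top

x = rng.standard_normal(D)
y, chosen = moe_forward(x)
print(f"active experts: {sorted(chosen.tolist())}")
print(f"fraction of expert params used per token: {TOP_K / N_EXPERTS:.3f}")
```

With 3 of 16 experts active, only ~19% of expert parameters are touched per token, which is the mechanism behind MoE inference-cost claims like the 16B/3B split above.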