EngGPT2: Sovereign, Efficient and Open Intelligence
arXiv cs.CL / March 18, 2026
Key Points
- EngGPT2-16B-A3B is a newly announced Italian LLM from Engineering Group designed to be sovereign, efficient, and open, with explicit alignment to the EU AI Act.
- It uses a Mixture-of-Experts architecture trained from scratch, with 16 billion total parameters and roughly 3 billion active per inference, delivering performance competitive with 8-16B dense models on benchmarks such as MMLU-Pro, GSM8K, IFEval, and HumanEval.
- It was trained on 2.5 trillion tokens, with roughly 25% Italian-language data to strengthen European and Italian NLP capabilities at this scale.
- EngGPT2 claims substantial efficiency gains: one-fifth to one-half the inference compute, and one-tenth to one-sixth the training data and training compute, of comparable models.
- It supports multiple reasoning modes, including non-reasoning, Italian-English reasoning, and turbo-reasoning, positioning it for real-time, multilingual use cases and as a building block of Europe's open-weight AI ecosystem.
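EngGPT2's internals are not described beyond the 16B-total / ~3B-active figures, but the efficiency claim follows from how Mixture-of-Experts routing works in general: only the top-k experts selected by a learned router run per token, so compute scales with active rather than total parameters. A minimal illustrative sketch (all names and shapes hypothetical, not EngGPT2's actual implementation):

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Route one token vector through the top-k of n experts.

    x:         (d,) token hidden state
    gate_w:    (d, n_experts) router weights
    expert_ws: list of (d, d) expert weight matrices

    Only the k selected experts execute, so per-token compute
    tracks active parameters, not total parameters.
    """
    logits = x @ gate_w                        # router scores, (n_experts,)
    top = np.argsort(logits)[-top_k:]          # indices of the k best experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                       # softmax over selected experts only
    # weighted sum of the selected experts' outputs
    return sum(p * (x @ expert_ws[i]) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d, n = 8, 4
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n))
experts = [rng.standard_normal((d, d)) for _ in range(n)]
y = moe_forward(x, gate_w, experts)            # (8,) output; only 2 of 4 experts ran
```

With 2 of 4 experts active here, roughly half the expert parameters are exercised per token; scaling the same idea to many experts is what lets a 16B-parameter model run with only ~3B active.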