Seq vs Seq: An Open Suite of Paired Encoders and Decoders
arXiv cs.CL / 3/13/2026
Key Points
- The authors introduce the Ettin suite: paired encoder-only and decoder-only models ranging from 17M to 1B parameters, trained on up to 2 trillion tokens. By applying an identical training recipe to both architectures, they achieve state-of-the-art results in each category, with the encoders outperforming ModernBERT and the decoders outperforming Llama 3.2 and SmolLM2.