SAGE Celer 2.6 Technical Card

arXiv cs.AI / April 17, 2026


Key Points

  • SAGEA has introduced SAGE Celer 2.6, a new general-purpose Celer model available in 5B, 10B, and 27B parameter sizes.
  • The model incorporates extensive architectural changes and further pre-training atop an undisclosed base model, aiming to improve overall reasoning quality.
  • SAGEA claims that its Inverse Reasoning (IR) pipeline helps the model validate its own logic paths to reduce cascading errors and hallucinations on complex reasoning tasks.
  • SAGE Celer 2.6 includes native multimodal capabilities with an end-to-end vision encoder, and it is optimized for South Asian languages via a custom Devanagari tokenizer, emphasizing strong Nepali and Hindi performance while maintaining English reasoning ability.
  • The announcement reports competitive results on mathematics, coding, and general intelligence benchmarks (ACUMEN) with low latency.

Abstract

We introduce SAGE Celer 2.6, the latest in our line of general-purpose Celer models from SAGEA. Celer 2.6 is available in 5B, 10B, and 27B parameter sizes and benefits from extensive architectural modifications and further pre-training atop an undisclosed base model. Using our Inverse Reasoning (IR) pipeline, SAGEA natively trains Celer 2.6 to validate its own logic paths, minimizing cascading errors and hallucinations in complex reasoning tasks. Celer 2.6 also offers natively integrated multimodal functionality with an end-to-end vision encoder, avoiding common pitfalls of adapter-based approaches. Celer 2.6 provides highly competitive results on mathematics, coding, and general intelligence benchmarks (ACUMEN), along with low latency. Most importantly, Celer 2.6 is specifically optimized for South Asian language support, with a custom tokenizer for the Devanagari script and strong performance in both Nepali and Hindi without sacrificing English reasoning ability.
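As a rough illustration of why a script-specific tokenizer matters (this is an illustrative sketch only; SAGEA has not published Celer 2.6's tokenizer details), consider that every Devanagari code point occupies three bytes in UTF-8, so a naive byte-level vocabulary inflates Nepali and Hindi sequence lengths roughly threefold compared to a codepoint-aware one:

```python
# Illustrative only — not SAGEA's actual tokenizer. Compares the number of
# units a pure byte-level tokenizer vs. a codepoint-level (script-aware)
# tokenizer would see for the same text. Devanagari code points are 3 bytes
# each in UTF-8, while ASCII is 1 byte each.

def byte_units(text: str) -> int:
    """Units seen by a pure byte-level tokenizer."""
    return len(text.encode("utf-8"))

def codepoint_units(text: str) -> int:
    """Units seen by a codepoint-level tokenizer."""
    return len(text)

nepali = "नमस्ते"   # "namaste" in Devanagari: 6 code points
english = "hello"   # 5 ASCII characters

print(byte_units(nepali), codepoint_units(nepali))    # 18 vs 6
print(byte_units(english), codepoint_units(english))  # 5 vs 5
```

A production tokenizer would go further, merging frequent character sequences (e.g. whole aksharas or subwords) into single tokens, but even this simple comparison shows why Devanagari text is disproportionately expensive for byte-oriented vocabularies.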