Arithmetic in the Wild: Llama uses Base-10 Addition to Reason About Cyclic Concepts
arXiv cs.AI / 5/5/2026
Key Points
- The paper studies how Llama-3.1-8B reasons about cyclic concepts (like “what month is six months after August”) and finds the model does not directly perform modular arithmetic in the cycle’s period (e.g., 12 months).
- Instead, Llama-3.1-8B appears to use a generic, task-agnostic addition mechanism: it first computes a base-10 sum of the two inputs, then maps that intermediate result back into the cyclic concept space.
- The authors argue that the summation is carried out using task-agnostic Fourier features whose periods align with base-10 addition (e.g., 2, 5, 10) rather than the cyclic concept’s own period.
- Mechanistically, the study identifies a small, reusable set of 28 MLP neurons at layer 18 (about 0.2% of the neurons in that layer's MLP) that cluster into groups, each implementing the sum for a Fourier feature with a distinct period.
- Overall, the work ties causal-abstraction analysis to the model's choice of feature geometry, sharpening the mechanistic-interpretability picture of how language models reuse generic circuits across tasks.
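The two-step account in the bullets above can be sketched in a few lines of Python. This is an illustrative toy, not the paper's implementation: the function names, the phasor encoding, and the argmax decoding are assumptions chosen to mirror the described mechanism (task-agnostic base-10 addition via Fourier features with periods 2, 5, and 10, followed by a map back into the cyclic concept space).

```python
import numpy as np

MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def month_after(month: str, k: int) -> str:
    """Two-step process hypothesized by the paper: a generic base-10 sum,
    then a mapping of the intermediate result back into the cycle."""
    n = MONTHS.index(month) + 1        # e.g. August -> 8
    raw_sum = n + k                    # 8 + 6 = 14: plain base-10 addition,
                                       # not arithmetic mod 12
    return MONTHS[(raw_sum - 1) % 12]  # only now wrap into the 12-month cycle

def fourier_encode(n: int, periods=(2, 5, 10)):
    # one unit phasor per period, with angle proportional to n
    return {T: np.exp(2j * np.pi * n / T) for T in periods}

def fourier_add(fa, fb):
    # multiplying unit phasors adds their angles, so each period-T
    # feature of (a + b) falls out without any carry logic
    return {T: fa[T] * fb[T] for T in fa}

def decode_units(f):
    # phase-match against candidate digits; periods (2, 5, 10) determine
    # the sum only modulo 10, their least common multiple
    scores = [sum((np.conj(fourier_encode(d)[T]) * f[T]).real for T in f)
              for d in range(10)]
    return int(np.argmax(scores))
```

For the running example, `month_after("August", 6)` returns `"February"`, and decoding `fourier_add(fourier_encode(8), fourier_encode(6))` recovers 4, the units digit of 14; in the paper's account, lower-frequency components would carry the remaining magnitude information.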