Accurate Legal Reasoning at Scale: Neuro-Symbolic Offloading and Structural Auditability for Robust Legal Adjudication

arXiv cs.CL / 5/5/2026


Key Points

  • The paper introduces “Amortized Intelligence,” a neuro-symbolic method in which an LLM translates legal text once into a typed graph intermediate representation, the Deterministic Autonomous Contract Language (DACL), rather than relying on repeated probabilistic reasoning at inference time.
  • Legal adjudication is performed via deterministic graph execution, producing a visually auditable trace aimed at meeting the strict transparency and auditability requirements of legal decision-making.
  • Experiments compare the DACL-based agent against runtime large reasoning model baselines (including GPT-5.2 and Gemini 3 Pro), reporting near-perfect consistency and reduced failures associated with the “reasoning cliff.”
  • The approach claims substantial efficiency gains, cutting compute costs by over 90% in high-volume legal workflows while maintaining robust, consistent outputs.
  • Overall, the work targets production-readiness for legal AI systems by combining structured representation, deterministic execution, and traceability to reduce reasoning errors and inference expenses.
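The translate-once / execute-many pattern above can be sketched in a few lines. The paper does not publish DACL's schema, so every type, op name, and clause below is hypothetical; this is only a minimal illustration of deterministic graph execution with an auditable trace, not the authors' implementation.

```python
from dataclasses import dataclass, field

# Hypothetical typed-graph IR in the spirit of DACL; the real schema is not
# described in this summary, so all names and ops here are illustrative.
@dataclass
class Node:
    node_id: str
    op: str                       # "input", "const", "ge" (>=), "and"
    inputs: list = field(default_factory=list)
    value: object = None          # payload for "input"/"const" nodes

def execute(nodes, root_id, facts, trace):
    """Deterministically evaluate the graph, appending each step to `trace`."""
    node = nodes[root_id]
    if node.op == "input":
        result = facts[node.value]
    elif node.op == "const":
        result = node.value
    elif node.op == "ge":
        a, b = (execute(nodes, i, facts, trace) for i in node.inputs)
        result = a >= b
    elif node.op == "and":
        result = all(execute(nodes, i, facts, trace) for i in node.inputs)
    else:
        raise ValueError(f"unknown op: {node.op}")
    trace.append((node.node_id, node.op, result))   # auditable step
    return result

# "Translated once" by the LLM: a toy clause saying payment is due if the
# invoice is at least 30 days late AND was actually delivered.
graph = {
    "days":      Node("days", "input", value="days_late"),
    "threshold": Node("threshold", "const", value=30),
    "late":      Node("late", "ge", inputs=["days", "threshold"]),
    "delivered": Node("delivered", "input", value="invoice_delivered"),
    "due":       Node("due", "and", inputs=["late", "delivered"]),
}

trace = []
verdict = execute(graph, "due",
                  {"days_late": 45, "invoice_delivered": True}, trace)
print(verdict)        # True
for step in trace:
    print(step)       # every intermediate evaluation, in order
```

Because the graph is evaluated by a fixed interpreter rather than a probabilistic model, the same facts always yield the same verdict and the same trace, which is the consistency and auditability property the paper targets.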

Abstract

Legal texts often contain computational legal clauses: provisions whose understanding requires complex logic. While frontier Large Reasoning Models (LRMs) can describe such clauses, building production-ready systems is limited by reasoning errors and the high cost of inference. We propose Amortized Intelligence, a neuro-symbolic approach in which an LLM is used once to translate a legal text into Deterministic Autonomous Contract Language (DACL), a typed graph intermediate representation. Adjudication then relies on deterministic graph execution with a visually auditable trace. In comparisons against runtime LRM baselines (including GPT-5.2 and Gemini 3 Pro), our DACL-based agent achieves near-perfect consistency and mitigates the "reasoning cliff" observed in probabilistic models. The system reduces compute costs by over 90% in high-volume workflows while satisfying the strict auditability requirements of legal adjudication.
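The claimed >90% saving is simple amortization: one expensive LLM translation pass spread over many cheap deterministic executions. The per-call prices below are assumptions for illustration, not figures reported in the paper.

```python
# Illustrative amortization arithmetic; all dollar figures are assumed.
llm_cost_per_adjudication = 0.50   # runtime LRM reasoning call (assumed)
translation_cost = 5.00            # one-time LLM pass producing the graph
exec_cost = 0.001                  # one deterministic graph execution (assumed)
n = 10_000                         # adjudications in a high-volume workflow

baseline = n * llm_cost_per_adjudication          # pay the LRM every time
amortized = translation_cost + n * exec_cost      # pay the LLM once
savings = 1 - amortized / baseline
print(f"baseline ${baseline:,.2f}, amortized ${amortized:,.2f}, "
      f"savings {savings:.1%}")
```

Under these assumed prices the saving far exceeds 90%, and it grows with volume, since the one-time translation cost shrinks toward zero per adjudication.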