Knowledge Capsules: Structured Nonparametric Memory Units for LLMs
arXiv cs.CL · April 23, 2026
Key Points
- The paper argues that conventional LLM knowledge storage in parametric weights is expensive to update, and that standard RAG is indirect because retrieved knowledge competes with input tokens in attention.
- It introduces “Knowledge Capsules,” structured nonparametric memory units that encode normalized relational knowledge built directly from document corpora using a frozen base model.
- Instead of appending knowledge as text, the approach uses an External Key Value Injection (KVI) framework to compile capsules into attention-compatible key/value representations, so they participate directly in the attention computation.
- The authors report consistent improvements over RAG and GraphRAG on multiple QA benchmarks, especially for long-context and multi-hop reasoning, without requiring any parameter updates.
- The contribution is positioned as shifting knowledge integration from context-level token augmentation to memory-level interaction, aiming to improve stability and accuracy.
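The core mechanism can be illustrated with a toy sketch: instead of prepending retrieved text as input tokens, precompiled key/value pairs are concatenated with the context's own keys and values inside scaled dot-product attention. This is a minimal single-head NumPy illustration of the general idea, not the paper's implementation; the function name, shapes, and capsule tensors are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_capsules(q, k_ctx, v_ctx, k_caps, v_caps):
    """Scaled dot-product attention in which precompiled capsule
    key/value pairs join the context's own K/V, so the external
    knowledge competes in attention without occupying any
    input-token positions (illustrative sketch)."""
    k = np.concatenate([k_caps, k_ctx], axis=0)  # (m + n, d)
    v = np.concatenate([v_caps, v_ctx], axis=0)  # (m + n, d)
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                # (t, m + n)
    return softmax(scores) @ v                   # (t, d)

# Toy shapes: 4 query positions, 6 context tokens, 3 capsule slots, dim 8.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
out = attention_with_capsules(
    q,
    k_ctx=rng.standard_normal((6, 8)),
    v_ctx=rng.standard_normal((6, 8)),
    k_caps=rng.standard_normal((3, 8)),
    v_caps=rng.standard_normal((3, 8)),
)
print(out.shape)  # (4, 8)
```

Note that the output shape depends only on the queries, which is why this style of injection leaves the input sequence length unchanged while still letting the model attend to the external memory.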