QERNEL: a Scalable Large Electron Model
arXiv cs.AI, April 30, 2026
Key Points
- The paper introduces QERNEL, a foundational "neural wavefunction" model that variationally solves entire families of many-electron Hamiltonians, learning their ground states across a parameter space within a single model.
- QERNEL combines FiLM-based parameter conditioning with scale-efficient components (mixture-of-experts and grouped-query attention) to improve expressivity without a large increase in compute.
- The authors demonstrate the approach on interacting electrons in semiconductor moiré heterobilayers, training a single weight-shared model on systems of up to 150 electrons.
- By conditioning the many-electron Schrödinger solution on the moiré potential depth, QERNEL reproduces both quantum liquid and crystal phases and identifies a sharp phase transition via abrupt changes in interaction energy and charge density.
- The work is positioned as a foundation for moiré quantum materials and as an architectural step toward a “Large Electron Model” for solids.
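The FiLM-based parameter conditioning mentioned above can be illustrated with a minimal sketch. This is not QERNEL's actual implementation; the names, shapes, and the use of a single scalar conditioning variable (standing in for the moiré potential depth) are illustrative assumptions. The core FiLM idea is that the conditioning input produces a per-feature scale (gamma) and shift (beta) applied to the network's hidden features, so one set of weights can represent a whole family of Hamiltonians:

```python
import numpy as np

def film_layer(h, cond, W_gamma, b_gamma, W_beta, b_beta):
    """Feature-wise Linear Modulation (FiLM).

    The conditioning vector `cond` (e.g. a Hamiltonian parameter such as
    the moire potential depth) is mapped to a per-feature scale `gamma`
    and shift `beta`, which modulate the hidden features `h`.
    """
    gamma = cond @ W_gamma + b_gamma   # (batch, d_hidden)
    beta = cond @ W_beta + b_beta      # (batch, d_hidden)
    return gamma * h + beta

# Illustrative shapes: 1-D conditioning parameter, 8 hidden features.
rng = np.random.default_rng(0)
batch, d_cond, d_hidden = 4, 1, 8
h = rng.normal(size=(batch, d_hidden))     # hidden electron features
cond = rng.normal(size=(batch, d_cond))    # hypothetical potential depth

W_gamma = rng.normal(size=(d_cond, d_hidden))
b_gamma = np.ones(d_hidden)                # init near identity scale
W_beta = rng.normal(size=(d_cond, d_hidden))
b_beta = np.zeros(d_hidden)                # init near zero shift

out = film_layer(h, cond, W_gamma, b_gamma, W_beta, b_beta)
print(out.shape)  # (4, 8)
```

Because gamma and beta are functions of the Hamiltonian parameter, changing the conditioning input (rather than retraining the weights) is what lets a single model sweep across the parameter space and expose phase boundaries.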