Self-Conditioned Denoising for Atomistic Representation Learning
arXiv cs.LG · March 19, 2026
📰 News · Models & Research
Key Points
- The paper introduces Self-Conditioned Denoising (SCD), a backbone-agnostic pretraining objective that uses self-embeddings to condition the denoising of atomistic data (a rough sketch follows this list).
- SCD applies across diverse domains, including small molecules, proteins, periodic materials, and non-equilibrium geometries, addressing a limitation of prior SSL objectives, which were confined to ground-state geometries or a single domain.
- With the backbone architecture and pretraining data held fixed, SCD significantly outperforms previous SSL methods and matches or exceeds supervised energy-and-force pretraining on downstream benchmarks.
- A small, fast graph neural network pretrained with SCD matches or outperforms larger models trained on substantially larger labeled or unlabeled datasets.
- Code for SCD is available at https://github.com/TyJPerez/SelfConditionedDenoisingAtoms
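The summary does not spell out the mechanics, but the core idea, conditioning a denoiser on the model's own embeddings of the clean structure, lends itself to a short sketch. Below is a minimal, hypothetical PyTorch illustration; `TinyEncoder`, `DenoiseHead`, `scd_step`, the noise scale `sigma`, and the choice of concatenation for conditioning are all assumptions made for illustration, not the authors' actual architecture or API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Stand-in for any atomistic GNN backbone: maps (coords, atom types)
    to per-atom embeddings. Purely illustrative."""
    def __init__(self, n_types=100, dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_types, dim)
        self.mlp = nn.Sequential(nn.Linear(dim + 3, dim), nn.SiLU(), nn.Linear(dim, dim))

    def forward(self, coords, atom_types):
        # Per-atom features: element embedding concatenated with raw coordinates.
        return self.mlp(torch.cat([self.embed(atom_types), coords], dim=-1))

class DenoiseHead(nn.Module):
    """Predicts the per-atom noise vector, conditioned on self-embeddings of
    the clean structure (conditioning-by-concatenation is an assumption)."""
    def __init__(self, dim=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU(), nn.Linear(dim, 3))

    def forward(self, noisy_emb, self_emb):
        return self.mlp(torch.cat([noisy_emb, self_emb], dim=-1))

def scd_step(encoder, head, coords, atom_types, sigma=0.1):
    """One hypothetical SCD-style pretraining step."""
    # 1. Self-conditioning: embed the *clean* geometry with the same backbone.
    #    Detaching it is an assumption, meant to block a trivial shortcut.
    with torch.no_grad():
        self_emb = encoder(coords, atom_types)
    # 2. Corrupt the coordinates with Gaussian noise of scale sigma.
    noise = torch.randn_like(coords) * sigma
    # 3. Embed the noisy geometry and denoise, conditioned on the self-embeddings.
    noisy_emb = encoder(coords + noise, atom_types)
    # 4. Regress the added noise (standard denoising objective).
    return F.mse_loss(head(noisy_emb, self_emb), noise)

# Toy usage: a fake 5-atom structure.
encoder, head = TinyEncoder(), DenoiseHead()
coords = torch.randn(5, 3)                # 3D positions
atom_types = torch.randint(0, 100, (5,))  # integer element types
loss = scd_step(encoder, head, coords, atom_types)
loss.backward()
```

Whether the real method freezes the self-embeddings, uses a different conditioning mechanism, or feeds embeddings into every message-passing layer is not stated in the summary; the linked repository is the authoritative reference.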