DT-BEHRT: Disease Trajectory-aware Transformer for Interpretable Patient Representation Learning
arXiv cs.LG / 3/12/2026
📰 News · Models & Research
Key Points
- DT-BEHRT introduces a graph-enhanced, trajectory-aware Transformer for learning interpretable patient representations from longitudinal EHR data.
- The model explicitly disentangles diagnosis-centric interactions within organ systems and captures asynchronous disease progression to reflect real-world clinical trajectories.
- A tailored pre-training scheme combines trajectory-level code masking with ontology-informed ancestor prediction to improve semantic alignment across modules.
- On benchmark datasets, DT-BEHRT achieves strong predictive performance and yields clinically interpretable representations aligned with disease-centered reasoning, with the code released on GitHub.
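The pre-training scheme in the third point pairs two targets: reconstructing masked diagnosis codes along a patient's trajectory, and predicting each code's ontology ancestor. The paper's exact masking strategy and ontology are not given in this summary, so the following is only an illustrative sketch with a hypothetical toy ICD-style ontology and a made-up `make_pretraining_example` helper:

```python
import random

# Hypothetical toy ontology: child diagnosis codes mapped to ancestor
# categories. The actual ontology and masking details used by DT-BEHRT
# are not specified in this summary; this only illustrates the idea of
# combining code masking with ancestor prediction.
ONTOLOGY = {
    "I21.0": "I21", "I21.9": "I21",   # acute myocardial infarction
    "E11.9": "E11", "E11.2": "E11",   # type 2 diabetes
    "N18.3": "N18",                   # chronic kidney disease
}
MASK = "[MASK]"

def make_pretraining_example(trajectory, mask_prob=0.3, rng=random.Random(0)):
    """Build (inputs, code_targets, ancestor_targets) for one trajectory.

    - code_targets: the original code at masked positions (code masking)
    - ancestor_targets: the ontology ancestor of every code (ancestor
      prediction), supervising semantic alignment with the hierarchy
    """
    inputs, code_targets = [], []
    for code in trajectory:
        if rng.random() < mask_prob:
            inputs.append(MASK)          # model must reconstruct this code
            code_targets.append(code)
        else:
            inputs.append(code)
            code_targets.append(None)    # position not supervised
    ancestor_targets = [ONTOLOGY[c] for c in trajectory]
    return inputs, code_targets, ancestor_targets
```

A Transformer encoder would then be trained with a cross-entropy loss on the masked-code targets and a second head on the ancestor targets; this sketch only shows the data side of that setup.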