Handling and Interpreting Missing Modalities in Patient Clinical Trajectories via Autoregressive Sequence Modeling
arXiv cs.AI · April 22, 2026
💬 Opinion · Models & Research
Key Points
- The paper tackles the challenge of missing modalities in multimodal healthcare ML by reframing clinical diagnosis as autoregressive sequence modeling of a patient’s multimodal trajectory.
- It proposes a missingness-aware contrastive pre-training objective that learns a shared latent space across modalities even when some are absent.
- Using causal decoders adapted from large language models, the authors model temporal clinical signals while aiming to preserve interpretability.
- Experiments on MIMIC-IV and eICU fine-tuning benchmarks show that transformer-based autoregressive sequence modeling outperforms baseline approaches.
- Interpretability analysis finds that removing modalities can cause divergent model behavior across patient stays, and that the contrastive pre-training helps mitigate this issue.
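The missingness-aware contrastive objective in the second point can be sketched as a masked InfoNCE loss: embeddings from two modalities are pulled together in a shared latent space, but only for patients where both modalities were actually recorded. The function name, the pairing of exactly two modalities, and the temperature value below are illustrative assumptions; the paper's exact objective may differ.

```python
import numpy as np

def masked_contrastive_loss(emb_a, emb_b, present_mask, temperature=0.1):
    """InfoNCE-style contrastive loss between two modality embeddings,
    skipping patients for whom either modality is missing.

    emb_a, emb_b : (n_patients, d) embedding arrays, one per modality
    present_mask : (n_patients,) boolean, True where both modalities exist
    NOTE: names and the two-modality setup are illustrative assumptions.
    """
    # Drop patients with a missing modality instead of imputing them.
    a = emb_a[present_mask]
    b = emb_b[present_mask]
    # L2-normalize so dot products are cosine similarities.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    logits = (a @ b.T) / temperature  # (n, n) similarity matrix
    # Matching modality pairs sit on the diagonal; score them as an
    # n-way classification problem (standard InfoNCE formulation).
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))
```

Masking rather than imputing keeps the loss well-defined under arbitrary missingness patterns, which is the property the pre-training objective needs in order to learn a shared latent space from incomplete clinical records.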