OC-Distill: Ontology-aware Contrastive Learning with Cross-Modal Distillation for ICU Risk Prediction
arXiv cs.LG / 4/21/2026
Key Points
- The paper introduces OC-Distill, a two-stage machine learning framework for early ICU risk prediction, targeting tasks such as deterioration-severity and length-of-stay forecasting.
- It improves contrastive pretraining by using an ontology-aware objective grounded in the ICD hierarchy to model clinically meaningful patient similarity rather than treating all patients as equally strong negatives.
- It enhances representations by performing cross-modal knowledge distillation from clinical notes into a model that still only requires vital signs at inference.
- Experiments on multiple ICU prediction tasks using the MIMIC dataset show higher label efficiency and state-of-the-art performance among approaches that rely solely on vital signs at inference.
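The ontology-aware contrastive idea in the second bullet can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the shared-prefix ICD similarity, the positive-pair selection, and the negative down-weighting scheme are all assumptions made for the sketch.

```python
import numpy as np

def icd_similarity(code_a: str, code_b: str) -> float:
    """Hypothetical proxy for ICD-hierarchy closeness: fraction of the
    leading characters two codes share (ICD codes nest by prefix)."""
    shared = 0
    for ca, cb in zip(code_a, code_b):
        if ca != cb:
            break
        shared += 1
    return shared / max(len(code_a), len(code_b))

def ontology_weighted_infonce(z: np.ndarray, codes: list[str],
                              temperature: float = 0.1) -> float:
    """InfoNCE-style loss where negatives that are close in the ICD
    hierarchy are down-weighted, rather than treating all other
    patients as equally strong negatives."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    logits = z @ z.T / temperature
    n = len(codes)
    total = 0.0
    for i in range(n):
        sims = np.array([icd_similarity(codes[i], codes[j]) if j != i
                         else -1.0 for j in range(n)])
        pos = int(np.argmax(sims))   # illustrative positive: most similar code
        weights = 1.0 - sims         # similar patients contribute weaker negatives
        weights[i] = 0.0             # exclude self
        weights[pos] = 1.0           # keep the positive at full weight
        exp = np.exp(logits[i] - logits[i].max())
        total += -np.log(exp[pos] / (weights * exp).sum())
    return total / n
```

The key design point the sketch captures is that the denominator's negative terms are scaled by ontology distance, so two patients with nearly identical diagnoses are not pushed apart as hard as clinically unrelated ones.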
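The cross-modal distillation in the third bullet can likewise be sketched as a training loss: a note-informed teacher produces target representations, and a vitals-only student is trained to match them while still fitting the risk labels, so notes are never needed at inference. The cosine-matching term, the binary cross-entropy task term, and the `alpha` mixing weight are assumptions of this sketch, not the paper's stated loss.

```python
import numpy as np

def distillation_loss(student_repr: np.ndarray, teacher_repr: np.ndarray,
                      student_logits: np.ndarray, labels: np.ndarray,
                      alpha: float = 0.5) -> float:
    """Hypothetical combined objective: the vitals-only student matches
    the note-informed teacher's embeddings (cosine distance) and also
    fits binary risk labels (cross-entropy)."""
    # representation-matching term: 1 - cosine similarity, averaged
    s = student_repr / np.linalg.norm(student_repr, axis=1, keepdims=True)
    t = teacher_repr / np.linalg.norm(teacher_repr, axis=1, keepdims=True)
    match = (1.0 - (s * t).sum(axis=1)).mean()
    # task term: binary cross-entropy on the student's risk predictions
    p = 1.0 / (1.0 + np.exp(-student_logits))
    bce = -(labels * np.log(p + 1e-9)
            + (1 - labels) * np.log(1 - p + 1e-9)).mean()
    return alpha * match + (1 - alpha) * bce
```

At inference the teacher (and thus the notes modality) is dropped entirely; only the student's vitals encoder and classifier run, which is what the "vital signs only at inference" claim relies on.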