Beyond the Class Subspace: Teacher-Guided Training for Reliable Out-of-Distribution Detection in Single-Domain Models
arXiv cs.LG / 3/13/2026
💬 Opinion · Ideas & Deep Analysis · Models & Research
Key Points
- The paper identifies a failure mode called Domain-Sensitivity Collapse (DSC) in single-domain training, where supervised learning compresses features into a low-rank class subspace and suppresses directions carrying domain-shift signals.
- It provides theory showing that under DSC, distance- and logit-based OOD scores lose sensitivity to domain shift.
- The authors propose Teacher-Guided Training (TGT), which distills class-suppressed residual structure from a frozen multi-domain teacher (DINOv2) into the student during training, with no inference overhead since the teacher and auxiliary head are discarded after training.
- Across eight single-domain benchmarks, TGT yields large average far-OOD FPR@95 reductions for distance-based scorers (ResNet-50): 11.61 pp for MDS, 10.78 pp for ViM, and 12.87 pp for kNN, while maintaining or slightly improving in-domain OOD detection and classification accuracy.
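The core mechanism in TGT, as summarized above, is distilling the *class-suppressed residual* of frozen teacher features, i.e. the component of the teacher representation orthogonal to the class subspace, into an auxiliary student head that is discarded after training. A minimal NumPy sketch of that residual construction and an MSE-style distillation loss is below; all dimensions, the random stand-in features, and the `student_pred` head output are illustrative assumptions, not the paper's actual architecture or loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teacher features (standing in for a frozen multi-domain
# backbone such as DINOv2); sizes are illustrative only.
d, n, k = 16, 32, 4                      # feature dim, batch size, #classes
teacher_feats = rng.normal(size=(n, d))

# The class subspace would be spanned by class-discriminative directions
# (e.g. class means); here a random orthonormal basis stands in for it.
basis, _ = np.linalg.qr(rng.normal(size=(d, k)))   # d x k, orthonormal cols

# Projector onto the class subspace, and the class-suppressed residual:
# the part of the teacher feature living outside that subspace.
P = basis @ basis.T
residual = teacher_feats @ (np.eye(d) - P)

# Sanity check: the residual carries no class-subspace component.
assert np.allclose(residual @ basis, 0.0, atol=1e-8)

# During training, an auxiliary student head would regress this residual
# (an assumed MSE distillation objective); teacher and head are dropped
# at inference, so deployment cost is unchanged.
student_pred = rng.normal(size=(n, d)) @ (np.eye(d) - P)  # placeholder head
distill_loss = float(np.mean((student_pred - residual) ** 2))
print(distill_loss >= 0.0)
```

The intuition matches the DSC story in the first bullet: supervised training alone keeps only the class subspace, so forcing the student to also reconstruct the orthogonal residual preserves the domain-shift-sensitive directions that distance-based OOD scores rely on.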