Dual-Teacher Distillation with Subnetwork Rectification for Black-Box Domain Adaptation
arXiv cs.CV / 3/25/2026
Key Points
- The paper studies black-box domain adaptation, where neither the source data nor the source model is accessible, and transferable knowledge can be obtained only by querying the black-box source model with target samples.
- It proposes Dual-Teacher Distillation with Subnetwork Rectification (DDSR), which combines predictions from the black-box source model (specific knowledge) and a vision-language (ViL) model (general semantic priors) to produce more reliable pseudo labels (see the fusion sketch after this list).
- DDSR introduces subnetwork-driven regularization to curb the overfitting that noisy pseudo-label supervision can cause, improving robustness during adaptation (sketched below).
- The method iteratively refines both the target pseudo labels and the ViL prompts, then further optimizes the target model via self-training with classwise prototypes (see the prototype sketch below).
- Experiments across multiple benchmarks show DDSR delivers consistent gains over prior state-of-the-art approaches, including those that assume access to source data or source models.
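
To make the dual-teacher fusion concrete, here is a minimal sketch of one plausible fusion rule: average the softmax outputs of the two teachers and keep only high-confidence samples as pseudo labels. The function name `fuse_dual_teacher_labels`, the equal-weight average, and the threshold `tau` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def fuse_dual_teacher_labels(src_logits, vil_logits, tau=0.9):
    """Fuse black-box source predictions with ViL predictions.

    src_logits, vil_logits: (N, C) logits from the two teachers.
    The equal-weight average and confidence threshold are assumed;
    the paper's exact fusion rule may differ.
    """
    p_src = F.softmax(src_logits, dim=1)   # source-specific knowledge
    p_vil = F.softmax(vil_logits, dim=1)   # general semantic priors
    p_fused = 0.5 * (p_src + p_vil)        # average the two teachers
    conf, pseudo = p_fused.max(dim=1)
    mask = conf >= tau                     # trust only confident fusions
    return pseudo, mask, p_fused
```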
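One plausible reading of the subnetwork-driven regularization is a consistency term between the full target model and a randomly sampled subnetwork of it, which discourages memorizing noisy pseudo labels. The sketch below realizes the subnetwork by forcing dropout on intermediate features; the architecture, loss form, and unit weighting are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubnetClassifier(nn.Module):
    """Toy target model; subnet=True runs a random subnetwork by
    forcing dropout on the features (an assumed mechanism)."""
    def __init__(self, dim=256, num_classes=65, p=0.5):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.head = nn.Linear(dim, num_classes)
        self.p = p

    def forward(self, x, subnet=False):
        h = self.backbone(x)
        if subnet:
            # Force random unit masking even outside train mode.
            h = F.dropout(h, p=self.p, training=True)
        return self.head(h)

def rectification_loss(model, x, pseudo, mask):
    """Pseudo-label cross-entropy plus a KL consistency term pulling a
    sampled subnetwork toward the (detached) full-network prediction."""
    logits_full = model(x)
    logits_sub = model(x, subnet=True)
    ce = (F.cross_entropy(logits_full[mask], pseudo[mask])
          if mask.any() else logits_full.new_zeros(()))
    consistency = F.kl_div(
        F.log_softmax(logits_sub, dim=1),
        F.softmax(logits_full, dim=1).detach(),
        reduction="batchmean",
    )
    return ce + consistency
```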
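The classwise-prototype self-training step can likewise be sketched: build one prototype per class from probability-weighted target features, then relabel each sample by its nearest prototype. This follows a common recipe for source-free adaptation (in the spirit of SHOT); the paper's exact weighting and distance may differ.

```python
import torch
import torch.nn.functional as F

def prototype_pseudo_labels(feats, probs):
    """feats: (N, D) target features; probs: (N, C) fused predictions.
    Returns pseudo labels refined by nearest-prototype assignment."""
    feats = F.normalize(feats, dim=1)
    protos = probs.t() @ feats          # (C, D) probability-weighted sums
    protos = F.normalize(protos, dim=1)
    sim = feats @ protos.t()            # cosine similarity to prototypes
    return sim.argmax(dim=1)
```

In an iterative scheme, the refined labels from this step would feed back into the fusion and regularization stages above on the next round.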