Holistic Optimal Label Selection for Robust Prompt Learning under Partial Labels
arXiv cs.CV / 4/9/2026
📰 News · Ideas & Deep Analysis · Models & Research
Key Points
- The paper introduces Holistic Optimal Label Selection (HopS) to improve prompt learning for vision-language models when training data has only partial/ambiguous labels.
- HopS uses a local density-based filtering strategy over nearest-neighbor label candidates, combining label frequency and softmax confidence to pick plausible labels.
- It also adds a global label-assignment objective based on optimal transport, which aligns a uniform sampling distribution with the batch's candidate-label distributions by minimizing the expected transport cost.
- Experiments on eight benchmark datasets show HopS consistently boosts performance under partial supervision and surpasses prior baselines, indicating stronger robustness in weakly supervised settings.
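The global optimal-transport step above can be sketched with standard Sinkhorn iterations. This is a minimal illustration, not the paper's implementation: the function names, the entropic regularization, the cost `1 - softmax`, the non-candidate penalty, and the uniform row/column marginals are all assumptions made here for concreteness.

```python
import numpy as np

def sinkhorn(cost, row_marg, col_marg, eps=0.05, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations."""
    K = np.exp(-cost / eps)          # Gibbs kernel
    u = np.ones_like(row_marg)
    for _ in range(n_iters):
        v = col_marg / (K.T @ u)     # rescale columns toward col_marg
        u = row_marg / (K @ v)       # rescale rows toward row_marg
    return u[:, None] * K * v[None, :]

def assign_labels(probs, candidate_mask, penalty=2.0):
    """Pick one label per sample from its candidate set via a global OT plan.

    probs:          (B, C) softmax scores from the model.
    candidate_mask: (B, C) boolean; True where a class is in the sample's
                    partial-label candidate set.
    """
    B, C = probs.shape
    cost = 1.0 - probs               # confident candidate classes are cheap
    cost = np.where(candidate_mask, cost, penalty)  # non-candidates cost more
    # Uniform marginals over samples and classes (an assumption here;
    # the paper's exact choice of marginals may differ).
    row_marg = np.full(B, 1.0 / B)
    col_marg = np.full(C, 1.0 / C)
    plan = sinkhorn(cost, row_marg, col_marg)
    return plan.argmax(axis=1)       # hard assignment from the soft plan
```

In this sketch the local density-based filter would supply `candidate_mask`; the column marginal then forces assignments to stay balanced across classes, so an over-subscribed class sheds its least-confident samples rather than absorbing the whole batch.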