HyCal: A Training-Free Prototype Calibration Method for Cross-Discipline Few-Shot Class-Incremental Learning
arXiv cs.CV / 4/20/2026
Key Points
- The paper argues that existing few-shot class-incremental learning (FSCIL) methods break down in real settings where data comes from heterogeneous disciplines with imbalanced availability and different visual complexity.
- It introduces “Domain Gravity,” a representational asymmetry where overrepresented (or low-entropy) domains dominate the embedding space, causing prototype drift and performance degradation on underrepresented/high-entropy domains.
- To address this, it proposes XD-VSCIL, a cross-discipline benchmark designed to reflect real-world heterogeneity and to amplify Domain Gravity.
- It also presents HyCal (Hybrid Prototype Calibration), a training-free approach on frozen CLIP embeddings that combines cosine similarity and Mahalanobis distance to stabilize prototypes by capturing both directional alignment and covariance-aware magnitude.
- Experiments report that HyCal mitigates Domain Gravity and outperforms prior FSCIL approaches, improving the retention–adaptation trade-off while remaining computationally efficient.
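To make the hybrid scoring idea concrete, here is a minimal sketch of combining cosine similarity (directional alignment) with a Mahalanobis distance (covariance-aware magnitude) when scoring an embedding against a class prototype. The fusion rule, the `alpha` weight, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mahalanobis_sq(x: np.ndarray, mu: np.ndarray, cov_inv: np.ndarray) -> float:
    """Squared Mahalanobis distance from embedding x to prototype mu."""
    d = x - mu
    return float(d @ cov_inv @ d)

def hybrid_score(x: np.ndarray, mu: np.ndarray, cov_inv: np.ndarray,
                 alpha: float = 0.5) -> float:
    """Hypothetical fusion of cosine similarity and Mahalanobis distance.

    Higher score = better match: cosine rewards directional alignment,
    the (negated) Mahalanobis term penalizes covariance-aware deviation.
    alpha is an assumed mixing weight, not taken from the paper.
    """
    cos = float(x @ mu) / (np.linalg.norm(x) * np.linalg.norm(mu))
    return alpha * cos - (1.0 - alpha) * mahalanobis_sq(x, mu, cov_inv)

def classify(x: np.ndarray, prototypes: dict, cov_inv: np.ndarray) -> str:
    """Pick the class whose prototype yields the highest hybrid score."""
    return max(prototypes, key=lambda c: hybrid_score(x, prototypes[c], cov_inv))
```

Because the scoring is training-free, incremental sessions only add new prototypes (and, per class, a covariance estimate) on top of the frozen CLIP features; no backbone weights change. With an identity covariance, the Mahalanobis term reduces to squared Euclidean distance.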