HyCal: A Training-Free Prototype Calibration Method for Cross-Discipline Few-Shot Class-Incremental Learning

arXiv cs.CV / 4/20/2026


Key Points

  • The paper argues that existing few-shot class-incremental learning (FSCIL) methods break down in real settings where data comes from heterogeneous disciplines with imbalanced availability and different visual complexity.
  • It introduces “Domain Gravity,” a representational asymmetry where overrepresented (or low-entropy) domains dominate the embedding space, causing prototype drift and performance degradation on underrepresented/high-entropy domains.
  • To address this, it proposes XD-VSCIL, a cross-discipline benchmark designed to reflect real-world heterogeneity and to amplify Domain Gravity.
  • It also presents HyCal (Hybrid Prototype Calibration), a training-free approach on frozen CLIP embeddings that combines cosine similarity and Mahalanobis distance to stabilize prototypes by capturing both directional alignment and covariance-aware magnitude.
  • Experiments report that HyCal mitigates Domain Gravity and outperforms prior FSCIL approaches, improving the retention–adaptation trade-off while remaining computationally efficient.

Abstract

Pretrained Vision-Language Models (VLMs) like CLIP show promise in continual learning, but existing Few-Shot Class-Incremental Learning (FSCIL) methods assume homogeneous domains and balanced data distributions, limiting real-world applicability where data arises from heterogeneous disciplines with imbalanced sample availability and varying visual complexity. We identify Domain Gravity, a representational asymmetry where data imbalance across heterogeneous domains causes overrepresented or low-entropy domains to disproportionately influence the embedding space, leading to prototype drift and degraded performance on underrepresented or high-entropy domains. To address this, we introduce Cross-Discipline Variable Few-Shot Class-Incremental Learning (XD-VSCIL), a benchmark capturing real-world heterogeneity and imbalance where Domain Gravity naturally intensifies. We propose Hybrid Prototype Calibration (HyCal), a training-free method combining cosine similarity and Mahalanobis distance to capture complementary geometric properties (directional alignment and covariance-aware magnitude), yielding stable prototypes under imbalanced heterogeneous conditions. Operating on frozen CLIP embeddings, HyCal achieves consistent retention-adaptation improvements while maintaining efficiency. Experiments show HyCal effectively mitigates Domain Gravity and outperforms existing methods in imbalanced cross-domain incremental learning.
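To make the hybrid scoring idea concrete, here is a minimal sketch of how a query embedding could be scored against class prototypes by blending cosine similarity (direction) with a Mahalanobis-based term (covariance-aware magnitude). The paper does not specify its exact fusion rule; the `alpha` weight, the `exp(-maha)` conversion of distance to similarity, and the covariance regularization below are illustrative assumptions, not HyCal's actual formulation.

```python
import numpy as np

def hybrid_scores(x, prototypes, covariances, alpha=0.5, eps=1e-6):
    """Score a frozen-embedding query `x` against per-class prototypes.

    Combines cosine similarity (directional alignment) with a Mahalanobis
    distance turned into a similarity (covariance-aware magnitude).
    `alpha` and the fusion rule are assumptions for illustration only.
    """
    x = np.asarray(x, dtype=np.float64)
    scores = []
    for mu, cov in zip(prototypes, covariances):
        # Cosine similarity: direction only, value in [-1, 1].
        cos = x @ mu / (np.linalg.norm(x) * np.linalg.norm(mu) + eps)
        # Mahalanobis distance: whitens the difference by the class
        # covariance (regularized for numerical stability).
        diff = x - mu
        inv_cov = np.linalg.inv(cov + eps * np.eye(cov.shape[0]))
        maha = np.sqrt(diff @ inv_cov @ diff)
        # Convert distance to a similarity and blend (assumed fusion).
        scores.append(alpha * cos + (1.0 - alpha) * np.exp(-maha))
    return np.array(scores)

# Toy usage: two classes in 2-D; the query lies near the first prototype.
protos = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
covs = [np.eye(2) * 0.1, np.eye(2) * 0.1]
scores = hybrid_scores(np.array([0.9, 0.1]), protos, covs)
predicted = int(np.argmax(scores))  # → 0 (first class)
```

The intuition behind the blend: cosine similarity is scale-invariant and robust to embedding-norm shifts, while the Mahalanobis term penalizes deviations along low-variance directions of each class, which is what makes the score covariance-aware.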