Dual-Criterion Curriculum Learning: Application to Temporal Data

arXiv cs.LG, March 26, 2026


Key Points

  • The paper proposes Dual-Criterion Curriculum Learning (DCCL), a curriculum-learning framework that defines instance-wise difficulty using both a loss-based measure and a density-based measure in the learned representation space.
  • DCCL aims to calibrate training evidence (loss) by explicitly accounting for data sparsity, which the authors argue increases learning difficulty for certain instances.
  • The approach is evaluated on multivariate time-series forecasting benchmarks using standard curriculum schedules (One-Pass and Baby-Steps).
  • Experimental results indicate that density-based and hybrid dual-criterion curricula outperform loss-only baselines and conventional non-curriculum training in this time-series setting.
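The dual-criterion idea above can be illustrated with a small sketch: blend a normalized per-instance loss with a sparsity proxy (mean distance to the k nearest neighbors in the representation space), so that instances in sparse regions are scored as harder. The blending weight `alpha`, the kNN sparsity proxy, and the function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dual_criterion_difficulty(losses, embeddings, k=10, alpha=0.5):
    """Toy dual-criterion difficulty score (higher = harder).

    Blends a min-max normalized per-instance loss with a kNN-based
    sparsity score computed in the representation space. Both the
    blending rule and the density proxy are illustrative choices.
    """
    # Normalize losses to [0, 1].
    loss_score = (losses - losses.min()) / (np.ptp(losses) + 1e-12)

    # Sparsity proxy: mean Euclidean distance to the k nearest neighbors.
    dists = np.linalg.norm(
        embeddings[:, None, :] - embeddings[None, :, :], axis=-1
    )
    np.fill_diagonal(dists, np.inf)  # exclude self-distances
    knn_mean = np.sort(dists, axis=1)[:, :k].mean(axis=1)
    sparsity_score = (knn_mean - knn_mean.min()) / (np.ptp(knn_mean) + 1e-12)

    # Hybrid criterion: alpha=1 recovers loss-only, alpha=0 density-only.
    return alpha * loss_score + (1 - alpha) * sparsity_score
```

Setting `alpha` to 1 or 0 recovers the loss-only and density-only curricula that the paper compares against the hybrid variant.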

Abstract

Curriculum Learning (CL) is a meta-learning paradigm that trains a model by presenting data instances incrementally according to a schedule based on difficulty progression. Defining meaningful difficulty measures is crucial and often the main bottleneck for effective learning, and the heuristics employed are in many cases application-specific. In this work, we propose the Dual-Criterion Curriculum Learning (DCCL) framework, which combines two views of instance-wise difficulty: a loss-based criterion complemented by a density-based criterion computed in the learned data representation space. Essentially, DCCL calibrates training-based evidence (loss) on the premise that data sparsity amplifies learning difficulty. As a testbed, we choose the time-series forecasting task and evaluate the framework on multivariate time-series benchmarks under standard One-Pass and Baby-Steps training schedules. Empirical results show the advantage of density-based and hybrid dual-criterion curricula over loss-only baselines and standard non-CL training in this setting.
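Under the common definitions of these two schedules in the CL literature, Baby-Steps trains on a cumulative union of difficulty buckets (easy, then easy plus medium, and so on), whereas One-Pass visits each bucket exactly once, easiest first. A minimal sketch, with illustrative function names and the instances assumed already sorted by difficulty:

```python
import numpy as np

def baby_steps_schedule(indices_sorted, num_buckets=3):
    """Yield cumulative stages: bucket 1, then buckets 1+2, etc."""
    seen = []
    for bucket in np.array_split(np.asarray(indices_sorted), num_buckets):
        seen.extend(bucket.tolist())
        yield list(seen)

def one_pass_schedule(indices_sorted, num_buckets=3):
    """Yield each difficulty bucket exactly once, easiest first."""
    for bucket in np.array_split(np.asarray(indices_sorted), num_buckets):
        yield bucket.tolist()
```

A training loop would iterate over the yielded index sets, fitting the model on each stage before moving to the next; under Baby-Steps the final stage covers the full training set.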