Drift-Aware Online Dynamic Learning for Nonstationary Multivariate Time Series: Application to Sintering Quality Prediction

arXiv cs.LG / 4/13/2026


Key Points

  • The paper addresses degraded performance in offline-trained models for nonstationary multivariate time series, highlighting problems from concept drift and delayed label verification in industrial settings like iron ore sintering.
  • It proposes a Drift-Aware Multi-Scale Dynamic Learning (DA-MSDL) framework that uses a multi-scale bi-branch convolutional network to capture both local fluctuations and long-term trends for improved multi-output prediction.
  • To handle label verification latency, DA-MSDL performs unsupervised drift detection using Maximum Mean Discrepancy (MMD), triggering online adaptation without waiting for delayed labels to arrive.
  • It introduces drift-severity-guided hierarchical fine-tuning with prioritized experience replay to rapidly align to changing data distributions while reducing catastrophic forgetting.
  • Experiments on real industrial sintering data and a public benchmark show DA-MSDL outperforming baselines under severe drift and maintaining stronger stability and cross-domain generalization.
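The unsupervised drift detection in the third point rests on a standard statistic: the Maximum Mean Discrepancy between a reference feature window and the current stream window. The paper does not publish its implementation, so the sketch below is only a minimal illustration of the underlying idea, using a biased MMD² estimate with an RBF kernel; the bandwidth `gamma` and the drift `threshold` are illustrative assumptions (in practice both would be calibrated on drift-free data).

```python
import numpy as np

def rbf_kernel(x, y, gamma):
    # Pairwise RBF kernel values between rows of x and rows of y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(ref, cur, gamma=0.1):
    """Biased estimate of squared MMD between two feature batches."""
    kxx = rbf_kernel(ref, ref, gamma).mean()
    kyy = rbf_kernel(cur, cur, gamma).mean()
    kxy = rbf_kernel(ref, cur, gamma).mean()
    return kxx + kyy - 2.0 * kxy

rng = np.random.default_rng(0)
ref   = rng.normal(0.0, 1.0, size=(200, 8))  # reference window features
same  = rng.normal(0.0, 1.0, size=(200, 8))  # same distribution
drift = rng.normal(1.5, 1.0, size=(200, 8))  # mean-shifted stream

threshold = 0.05  # assumed; calibrated on drift-free data in practice
flag_same  = mmd2(ref, same)  > threshold  # no drift flagged
flag_drift = mmd2(ref, drift) > threshold  # shift detected, adapt model
```

Because the statistic is computed on features alone, the check runs before any label arrives, which is exactly what makes it usable under verification latency.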

Abstract

Accurate prediction of nonstationary multivariate time series remains a critical challenge in complex industrial systems such as iron ore sintering. In practice, pronounced concept drift compounded by significant label verification latency rapidly degrades the performance of offline-trained models. Existing methods based on static architectures or passive update strategies struggle to simultaneously extract multi-scale spatiotemporal features and overcome the stability-plasticity dilemma without immediate supervision. To address these limitations, a Drift-Aware Multi-Scale Dynamic Learning (DA-MSDL) framework is proposed to maintain robust multi-output predictive performance via online adaptive mechanisms on nonstationary data streams. The framework employs a multi-scale bi-branch convolutional network as its backbone to disentangle local fluctuations from long-term trends, thereby enhancing representational capacity for complex dynamic patterns. To circumvent the label latency bottleneck, DA-MSDL leverages Maximum Mean Discrepancy (MMD) for unsupervised drift detection. By quantifying online statistical deviations in feature distributions, DA-MSDL proactively triggers model adaptation prior to inference. Furthermore, a drift-severity-guided hierarchical fine-tuning strategy is developed. Supported by prioritized experience replay from a dynamic memory queue, this approach achieves rapid distribution alignment while effectively mitigating catastrophic forgetting. Long-horizon experiments on real-world industrial sintering data and a public benchmark dataset demonstrate that DA-MSDL consistently outperforms representative baselines under severe concept drift. Exhibiting strong cross-domain generalization and predictive stability, the proposed framework provides an effective online dynamic learning paradigm for quality monitoring in nonstationary environments.
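The abstract's "prioritized experience replay from a dynamic memory queue" can be illustrated with a small sketch. The paper's own priority rule and queue design are not given here, so the following is an assumption-laden toy: priorities are taken to be per-sample prediction errors, the queue is a fixed-capacity FIFO, and replay samples are drawn with probability proportional to priority. Class and parameter names are hypothetical.

```python
import numpy as np
from collections import deque

class PrioritizedMemory:
    """Minimal sketch of a dynamic memory queue with prioritized replay.

    Assumption (not from the paper): a sample's priority is its last
    prediction error, and replay probability is proportional to it.
    """
    def __init__(self, capacity=512):
        self.buffer = deque(maxlen=capacity)      # stores (x, y) pairs
        self.priorities = deque(maxlen=capacity)  # one priority per pair

    def push(self, x, y, error):
        # Oldest entries are evicted automatically once capacity is hit.
        self.buffer.append((x, y))
        self.priorities.append(float(error) + 1e-6)  # keep strictly positive

    def sample(self, batch_size, rng=None):
        rng = rng or np.random.default_rng()
        p = np.asarray(self.priorities)
        p = p / p.sum()
        idx = rng.choice(len(self.buffer), size=batch_size, replace=True, p=p)
        return [self.buffer[i] for i in idx]

mem = PrioritizedMemory(capacity=4)
for i in range(6):                    # the two oldest entries get evicted
    mem.push(x=i, y=2 * i, error=i + 1.0)
batch = mem.sample(3)                 # high-error samples replayed more often
```

Replaying high-error historical samples alongside freshly adapted data is one common way to pull the model toward the new distribution while still rehearsing old regimes, which is the stability-plasticity trade-off the abstract refers to.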