From XAI to MLOps: Explainable Concept Drift Detection with Profile Drift Detection

arXiv stat.ML / 4/7/2026


Key Points

  • Predictive models can lose accuracy as data distributions evolve (data drift); concept drift is especially hard to detect because the underlying input–output relationship changes, not just the marginal input distributions.
  • The paper introduces Profile Drift Detection (PDD), a concept-drift detection method that uses explainable AI artifacts—specifically Partial Dependence Profiles (PDPs)—to capture meaningful changes in model behavior.
  • PDD defines new, computationally efficient drift metrics that quantify how PDPs change over the data stream, improving sensitivity to conceptual shifts compared with traditional accuracy- or marginal-distribution-based signals.
  • Experiments on synthetic and real-world datasets show PDD outperforms existing approaches while balancing the sensitivity and stability of drift alerts and maintaining high predictive performance.
  • The authors position PDD as compatible with MLOps workflows, supporting continuous monitoring and adaptive retraining for real-time deployments in dynamic environments.

Abstract

Predictive models often degrade in performance due to evolving data distributions, a phenomenon known as data drift. Among its forms, concept drift, where the relationship between explanatory variables and the response variable changes, is particularly challenging to detect and adapt to. Traditional drift detection methods often rely on metrics such as accuracy or marginal variable distributions, which may fail to capture subtle but important conceptual changes. This paper proposes a novel method, Profile Drift Detection (PDD), which enables both the detection of concept drift and an enhanced understanding of its underlying causes by leveraging an explainable AI tool: Partial Dependence Profiles (PDPs). PDD quantifies changes in PDPs through new drift metrics that are sensitive to shifts in the data stream while remaining computationally efficient. This approach is aligned with MLOps practices, emphasizing continuous model monitoring and adaptive retraining in dynamic environments. Experiments on synthetic and real-world datasets demonstrate that PDD outperforms existing methods by maintaining high predictive performance while effectively balancing sensitivity and stability in drift signals. The results highlight its suitability for real-time applications, and the paper concludes by discussing the method's advantages, limitations, and potential extensions to broader use cases.
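To make the core idea concrete, here is a minimal sketch of PDP-based drift detection, not the paper's actual PDD metrics: fit a model per data window, compute its Partial Dependence Profile on a fixed grid, and flag drift when the gap to the reference PDP grows. The synthetic stream, window sizes, and mean-absolute-gap score are illustrative assumptions, chosen so that a sign flip in one feature's effect (concept drift with unchanged marginals) is visible in the PDP while a stable window is not.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def make_stream(n, flipped=False):
    # Synthetic window; after the simulated drift the effect of x0 flips
    # sign, changing the input-output relationship while the marginal
    # distributions of the inputs stay identical.
    X = rng.uniform(-2, 2, size=(n, 3))
    sign = -1.0 if flipped else 1.0
    y = sign * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, n)
    return X, y

def pdp(model, X, feature, grid):
    # Partial Dependence Profile: average prediction with `feature`
    # clamped to each grid value (the standard PDP estimator).
    curve = []
    for v in grid:
        Xg = X.copy()
        Xg[:, feature] = v
        curve.append(model.predict(Xg).mean())
    return np.asarray(curve)

def pdp_drift_score(ref_curve, new_curve):
    # Illustrative drift metric: mean absolute gap between two PDPs
    # evaluated on the same grid (not the paper's metric).
    return float(np.mean(np.abs(ref_curve - new_curve)))

grid = np.linspace(-2, 2, 20)

# Reference window: fit a model and record its PDP for feature 0.
X_ref, y_ref = make_stream(1000)
ref_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_ref, y_ref)
ref_curve = pdp(ref_model, X_ref, 0, grid)

# New windows: refit per window and compare PDPs against the reference.
scores = {}
for label, flipped in [("stable", False), ("drift", True)]:
    X_w, y_w = make_stream(1000, flipped=flipped)
    m = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_w, y_w)
    scores[label] = pdp_drift_score(ref_curve, pdp(m, X_w, 0, grid))
    print(f"{label}: PDP drift score = {scores[label]:.3f}")
```

An accuracy- or marginal-distribution-based monitor would see little change in this scenario, while the drifted window's PDP shifts sharply relative to the stable one, which is the kind of conceptual change the paper's PDPs are meant to surface.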