Transfer learning for nonparametric Bayesian networks

arXiv cs.LG / 4/2/2026


Key Points

  • The paper proposes two transfer-learning methods—PCS-TL (constraint-based) and HC-TL (score-based)—for learning nonparametric Bayesian networks when data are scarce.
  • It introduces specific metrics and experimental procedures to detect and mitigate negative transfer, where transferring knowledge worsens model performance.
  • For parameter estimation, it presents a log-linear pooling approach to combine information effectively across tasks/domains.
  • The evaluation learns kernel density estimation Bayesian networks and compares transfer-learning variants against non-transfer baselines using synthetic networks and UCI datasets, including noise and dataset modifications.
  • Statistical testing (a Friedman test with Bergmann-Hommel post-hoc analysis) provides evidence that the proposed methods improve performance while reducing the risk of negative transfer.
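To make the log-linear pooling idea in the third bullet concrete, the sketch below combines two kernel density estimates as a weighted geometric mean, p(x) ∝ p_source(x)^w · p_target(x)^(1−w), and renormalizes over a grid. The data, weight `w`, and use of `scipy.stats.gaussian_kde` are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical setup: abundant source-domain data, scarce target-domain data.
rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, size=1000)
target = rng.normal(0.3, 1.0, size=30)

kde_source = gaussian_kde(source)
kde_target = gaussian_kde(target)

# Log-linear pooling on a grid: weighted geometric mean of the two densities.
grid = np.linspace(-5.0, 5.0, 2001)
dx = grid[1] - grid[0]
w = 0.7  # pooling weight; how the paper selects it is not shown here
pooled = kde_source(grid) ** w * kde_target(grid) ** (1.0 - w)
pooled /= pooled.sum() * dx  # renormalize so the pooled density integrates to 1

print(round(pooled.sum() * dx, 6))  # → 1.0
```

The pooled estimate leans toward the source density where the target data are too sparse to be informative, which is the intuition behind pooling for parameter estimation under scarce data.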

Abstract

This paper introduces two transfer learning methodologies for estimating nonparametric Bayesian networks under scarce data. We propose two algorithms: a constraint-based structure learning method, called PC-stable-transfer learning (PCS-TL), and a score-based method, called hill climbing transfer learning (HC-TL). We also define specific metrics in each of them to tackle the negative transfer problem, a situation in which transfer learning has a negative impact on the model's performance. For the parameters, we propose a log-linear pooling approach. For the evaluation, we learn kernel density estimation Bayesian networks, a type of nonparametric Bayesian network, and compare their transfer-learning performance with that of models learned without transfer. To do so, we sample data from small-, medium- and large-sized synthetic networks and datasets from the UCI Machine Learning repository. We then add noise and modifications to these datasets to test the methods' ability to avoid negative transfer. To conclude, we perform a Friedman test with a Bergmann-Hommel post-hoc analysis to provide statistical evidence of the improved experimental behavior of our methods. Thus, PCS-TL and HC-TL prove to be reliable algorithms for improving the learning performance of a nonparametric Bayesian network with scarce data, which in real industrial environments implies a reduction in the time required to deploy the network.
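The statistical comparison described at the end of the abstract can be sketched with SciPy's Friedman test over per-dataset scores of competing algorithms. The scores below are made-up numbers for illustration only, and the Bergmann-Hommel post-hoc procedure is not available in SciPy, so it is omitted here.

```python
from scipy.stats import friedmanchisquare

# Hypothetical per-dataset scores for three methods on six datasets.
pcs_tl = [0.91, 0.88, 0.93, 0.90, 0.87, 0.92]
hc_tl  = [0.89, 0.90, 0.91, 0.92, 0.88, 0.90]
no_tl  = [0.80, 0.82, 0.79, 0.85, 0.81, 0.78]

# The Friedman test ranks the methods within each dataset and checks whether
# the mean ranks differ significantly across methods.
stat, p_value = friedmanchisquare(pcs_tl, hc_tl, no_tl)
print(p_value < 0.05)  # → True: with these made-up scores the gap is significant
```

A significant Friedman result only says that at least one method differs; the pairwise post-hoc analysis (Bergmann-Hommel in the paper) is what identifies which pairs differ.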
