Evaluation of Bagging Predictors with Kernel Density Estimation and Bagging Score

arXiv cs.LG / 4/7/2026


Key Points

  • The paper studies bagging predictors, where an ensemble prediction is commonly formed by taking the mean (or median) across models, but this can be inaccurate in certain parameter regions.
  • It proposes selecting a representative ensemble target y_BS using Kernel Density Estimation (KDE) within a nonlinear regression framework that leverages neural networks.
  • The method also outputs a confidence/quality metric called the Bagging Score (β_BS) to quantify reliability of the ensemble prediction.
  • Experiments indicate the KDE-based bagging approach yields better predictive performance than standard mean/median aggregation, across multiple error measures.
  • The approach is benchmarked against several nonlinear regression alternatives from the literature and achieves top rankings in the reported error metrics without using optimization or feature selection techniques.
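The selection step described above can be sketched in a few lines: fit a KDE over the individual model predictions and take the density mode as the ensemble output, rather than the mean or median. The sketch below is a minimal illustration of that idea, not the paper's implementation; in particular, the confidence proxy shown here (fraction of models within one KDE bandwidth of the mode) is a hypothetical stand-in, since the paper's exact formula for β_BS is not given in this summary.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_bagging(predictions, grid_size=512):
    """Pick the ensemble target as the mode of a KDE over the
    individual model predictions, instead of their mean/median."""
    preds = np.asarray(predictions, dtype=float)
    kde = gaussian_kde(preds)  # Gaussian kernels, Scott's bandwidth rule
    # Evaluate the density on a grid spanning the predictions
    # and take the grid point of maximum density as y_BS.
    grid = np.linspace(preds.min(), preds.max(), grid_size)
    y_bs = grid[np.argmax(kde(grid))]
    # Illustrative confidence proxy (NOT the paper's beta_BS formula):
    # fraction of models whose prediction lies within one KDE
    # bandwidth of the selected mode.
    bandwidth = kde.factor * preds.std(ddof=1)
    beta = float(np.mean(np.abs(preds - y_bs) <= bandwidth))
    return y_bs, beta

# A cluster of agreeing models plus one outlier: the mean is pulled
# toward the outlier, while the KDE mode stays with the cluster.
preds = [2.0, 2.1, 1.9, 2.05, 1.95, 8.0]
y_bs, beta = kde_bagging(preds)
```

On this toy input the mean is 3.0, while the KDE mode lands near the cluster at 2.0, which is the kind of parameter-region robustness the paper attributes to the KDE-based aggregation.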

Abstract

For a large set of predictions from several differently trained machine learning models, known as bagging predictors, the mean of all predictions is taken by default. However, this procedure can deviate from the actual ground truth in certain parameter regions. An approach is presented to determine a representative prediction y_BS from such a set using Kernel Density Estimation (KDE) in nonlinear regression with Neural Networks (NN), which simultaneously provides an associated quality criterion beta_BS, called the Bagging Score (BS), reflecting the confidence of the obtained ensemble prediction. It is shown that the new approach yields better predictions than the common use of the mean or median. In addition, the method is compared with several nonlinear regression approaches from the literature, achieving a top ranking in each of the calculated error metrics without using any optimization or feature selection technique.