Integrating Meta-Features with Knowledge Graph Embeddings for Meta-Learning

arXiv cs.LG / 3/23/2026


Key Points

  • KGmetaSP introduces a knowledge-graph-embeddings approach that uses existing experiment data to capture dataset-pipeline interactions for two meta-learning tasks: pipeline performance estimation (PPE) and dataset performance-based similarity estimation (DPSE).
  • It represents datasets and pipelines in a unified knowledge graph to support pipeline-agnostic PPE and distance-based retrieval for DPSE.
  • The authors validate the approach on a large-scale benchmark comprising 144,177 OpenML experiments, enabling rich cross-dataset evaluation.
  • KGmetaSP enables accurate PPE with a single pipeline-agnostic meta-model and improves DPSE performance over baselines.
  • The authors release KGmetaSP, the knowledge graph, and the benchmark, establishing a new reference point for meta-learning in the field.
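The core idea in the key points above is consolidating past experiment records into a unified knowledge graph linking datasets and pipelines. The paper does not specify the exact schema, so the following is a minimal, hypothetical sketch of turning (dataset, pipeline, score) records into KG triples; all entity and relation names (`evaluatedWith`, `hasPerformance`, the score buckets) are illustrative assumptions, not the authors' design.

```python
def experiments_to_triples(experiments):
    """Convert (dataset, pipeline, score) records into KG triples.

    Hypothetical schema: datasets and pipelines become entities,
    and raw scores are bucketed into discrete performance levels
    so they can serve as KG entities rather than raw floats.
    """
    triples = []
    for dataset, pipeline, score in experiments:
        triples.append((dataset, "evaluatedWith", pipeline))
        level = "perf:high" if score >= 0.8 else "perf:low"
        triples.append((f"{dataset}|{pipeline}", "hasPerformance", level))
    return triples


records = [
    ("iris", "svm_pipeline", 0.95),
    ("iris", "tree_pipeline", 0.70),
]
triples = experiments_to_triples(records)
```

A KG embedding model (e.g., TransE-style methods) trained on such triples would then place datasets that interact similarly with pipelines close together in the embedding space.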

Abstract

The vast collection of machine learning records available on the web presents a significant opportunity for meta-learning, where past experiments are leveraged to improve performance. Two crucial meta-learning tasks are pipeline performance estimation (PPE), which predicts pipeline performance on target datasets, and dataset performance-based similarity estimation (DPSE), which identifies datasets with similar performance patterns. Existing approaches primarily rely on dataset meta-features (e.g., number of instances, class entropy) to represent datasets numerically and approximate these meta-learning tasks. However, these approaches often overlook the wealth of past experimental results and pipeline metadata available. This limits their ability to capture dataset-pipeline interactions that reveal performance similarity patterns. In this work, we propose KGmetaSP, a knowledge-graph-embeddings approach that leverages existing experiment data to capture these interactions and improve both PPE and DPSE. We represent datasets and pipelines within a unified knowledge graph (KG) and derive embeddings that support pipeline-agnostic meta-models for PPE and distance-based retrieval for DPSE. To validate our approach, we construct a large-scale benchmark comprising 144,177 OpenML experiments, enabling a rich cross-dataset evaluation. KGmetaSP enables accurate PPE using a single pipeline-agnostic meta-model and improves DPSE over baselines. The proposed KGmetaSP, KG, and benchmark are released, establishing a new reference point for meta-learning and demonstrating how consolidating open experiment data into a unified KG advances the field.
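The abstract describes DPSE as distance-based retrieval over the learned embeddings: datasets whose embeddings lie close together are predicted to share performance patterns. A minimal sketch of that retrieval step is shown below; the embedding vectors and dataset names are made-up placeholders standing in for the KG embeddings the paper would learn, and cosine distance is one common choice of metric, not necessarily the one the authors use.

```python
import math


def cosine_distance(u, v):
    """1 - cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)


def most_similar(target, embeddings, k=2):
    """Rank other datasets by embedding distance to the target (DPSE-style retrieval)."""
    ranked = sorted(
        (name for name in embeddings if name != target),
        key=lambda name: cosine_distance(embeddings[target], embeddings[name]),
    )
    return ranked[:k]


# Placeholder embeddings; in KGmetaSP these would come from the trained KG model.
embeddings = {
    "iris": [0.9, 0.1, 0.0],
    "wine": [0.8, 0.2, 0.1],
    "mnist": [0.0, 0.9, 0.4],
}
neighbors = most_similar("iris", embeddings, k=2)
```

In a meta-learning workflow, pipelines that performed well on the retrieved neighbor datasets become warm-start candidates for the target dataset.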