Adaptive Kernel Selection for Kernelized Diffusion Maps
arXiv stat.ML · April 21, 2026
Key Points
- The paper addresses kernel selection as a core challenge in kernel-based spectral methods, showing how the choice of kernel in Kernelized Diffusion Maps (KDM) affects RKHS estimation accuracy and the stability/quality of recovered eigenfunctions.
- It proposes two complementary adaptive strategies. The first is a variational outer loop that learns continuous kernel parameters by differentiating through a Cholesky-reduced KDM eigenproblem, guided by an objective that combines eigenvalue maximization, subspace orthonormality, and RKHS regularization.
- The second is an unsupervised cross-validation pipeline that selects kernel families and bandwidths with an eigenvalue-sum criterion, made scalable via random Fourier features.
- The work includes a unified theoretical foundation proving Lipschitz dependence of KDM operators on kernel weights, continuity of spectral projectors under a spectral-gap condition, a residual-control theorem guaranteeing closeness to the target eigenspace, and exponential consistency of the cross-validation selector over a finite kernel dictionary.
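To make the first strategy concrete, here is a minimal sketch of scoring a kernel bandwidth through a Cholesky-reduced eigenproblem. The specific operator, the regularized metric `B = K + n*reg*I`, and the names `kdm_objective` / `select_sigma` are illustrative assumptions, not the paper's definitions, and plain grid search stands in for the paper's gradient-based updates through the eigensolver.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    """Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kdm_objective(X, sigma, m=3, reg=1e-3):
    """Score a bandwidth via a Cholesky-reduced eigenproblem.

    Hypothetical stand-in for the paper's KDM operator: the generalized
    problem K v = lam (K + n*reg*I) v is reduced with B = L L^T to the
    symmetric M = L^{-1} K L^{-T}, and sigma is scored by the sum of the
    top-m eigenvalues.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    B = K + n * reg * np.eye(n)      # regularized metric (assumed form)
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    M = Linv @ K @ Linv.T            # same spectrum as the generalized problem
    evals = np.linalg.eigvalsh(M)    # ascending order
    return float(evals[-m:].sum())

def select_sigma(X, sigmas, m=3):
    """Outer loop over candidate bandwidths (grid search in place of
    the paper's differentiable updates)."""
    scores = [kdm_objective(X, s, m) for s in sigmas]
    return sigmas[int(np.argmax(scores))]
```

Because each reduced eigenvalue equals k_i / (k_i + n·reg) for an eigenvalue k_i of K, the objective is bounded by m, which makes the score easy to sanity-check across bandwidths.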
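The second strategy can be sketched the same way. The split, the held-out "energy captured" score, and the names `eigenvalue_sum_score` / `cv_select` are hypothetical stand-ins for the paper's eigenvalue-sum criterion; only the random Fourier feature construction (Rahimi–Recht) is standard.

```python
import numpy as np

def rff_features(X, sigma, D=200, seed=0):
    """Random Fourier features approximating a Gaussian kernel:
    k(x, y) ~ z(x) . z(y), with z(x) = sqrt(2/D) cos(W^T x + b)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / sigma, size=(X.shape[1], D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def eigenvalue_sum_score(X_train, X_val, sigma, m=3, D=200):
    """Hypothetical eigenvalue-sum criterion: fit the top-m RFF
    covariance directions on the training split, then score the
    variance they capture on the held-out split."""
    Zt = rff_features(X_train, sigma, D)
    Zv = rff_features(X_val, sigma, D)
    C = Zt.T @ Zt / len(Zt)          # D x D empirical covariance
    _, evecs = np.linalg.eigh(C)
    U = evecs[:, -m:]                # top-m eigendirections
    return float(np.sum((Zv @ U) ** 2) / len(Zv))

def cv_select(X, sigmas, m=3, frac=0.5, seed=1):
    """Unsupervised CV over a finite bandwidth dictionary."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(frac * len(X))
    tr, va = X[idx[:cut]], X[idx[cut:]]
    scores = [eigenvalue_sum_score(tr, va, s, m) for s in sigmas]
    return sigmas[int(np.argmax(scores))]
```

Working in the D-dimensional feature space replaces an n × n kernel eigenproblem with a D × D one, which is the scalability benefit the random-Fourier-feature enhancement is after.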