Federated Personal Knowledge Graph Completion with Lightweight Large Language Models for Personalized Recommendations
arXiv cs.LG / 3/17/2026
Key Points
- Introduces FedTREK-LM, a framework that unifies lightweight large language models, evolving personal knowledge graphs (PKGs), federated learning, and Kahneman-Tversky Optimization to enable scalable, decentralized personalization (a minimal aggregation sketch follows this list).
- Demonstrates context-aware reasoning by prompting LLMs with structured PKGs to produce personalized recommendations, including movie and recipe suggestions, evaluated across three Qwen3 models (0.6B, 1.7B, and 4B); see the prompt-serialization sketch below.
- Reports more than a 4x improvement in F1-score over state-of-the-art baselines (HAKE, KBGAT, FedKGRec) on movie and food benchmarks.
- Finds that real user data is critical for effective personalization: substituting synthetic data degrades performance by up to 46%, indicating the approach preserves privacy but remains dependent on genuine user data.
- Suggests the approach generalizes across decentralized, evolving user PKGs, offering a practical paradigm for adaptive, LLM-powered personalization.
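To make the federated-learning component concrete, here is a minimal sketch of the weighted aggregation step such a framework might use, assuming each client fine-tunes a small adapter on its local PKG and shares only the adapter weights, never the raw graph. The function name, adapter representation, and size-weighted rule are assumptions for illustration; the summary does not specify the paper's actual aggregation scheme.

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Aggregate per-client adapter weights, weighted by local PKG size."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)               # shape: (n_clients, n_params)
    coeffs = np.array(client_sizes, dtype=float) / total
    return coeffs @ stacked                          # size-weighted mean of params

# Three hypothetical clients with toy 4-parameter adapters and PKGs of
# different sizes (number of local triples).
clients = [np.random.randn(4) for _ in range(3)]
sizes = [120, 45, 300]
global_update = fed_avg(clients, sizes)
print(global_update)
```

Weighting by local data size mirrors classic FedAvg, so clients with richer PKGs contribute proportionally more to the shared model while their data stays on-device.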
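The prompting step can likewise be sketched: a structured PKG is linearized into a context block that a small LLM (e.g., Qwen3 0.6B-4B, as evaluated in the paper) can condition on when answering a recommendation query. The triple schema and prompt wording below are hypothetical, not the paper's exact template.

```python
def pkg_to_prompt(triples, query):
    """Linearize (subject, relation, object) PKG triples into an LLM prompt."""
    facts = "\n".join(f"- {s} {r} {o}" for s, r, o in triples)
    return (
        "You are a personalized recommender.\n"
        f"Known facts about the user:\n{facts}\n\n"
        f"Task: {query}\nAnswer with a ranked list."
    )

# Toy PKG for one user; real graphs would evolve as new interactions arrive.
pkg = [
    ("user", "likes_genre", "sci-fi"),
    ("user", "watched", "Arrival"),
    ("user", "dislikes", "horror"),
]
prompt = pkg_to_prompt(pkg, "Recommend three movies the user has not seen.")
print(prompt)  # this string would then be passed to the local LLM
```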