LLM-Enhanced Energy Contrastive Learning for Out-of-Distribution Detection in Text-Attributed Graphs
arXiv cs.AI / 2026/3/24
Key points
- The paper targets node-level out-of-distribution (OOD) detection in text-attributed graphs, where training/test distribution mismatch can severely degrade node classification performance.
- It introduces LECT (LLM-Enhanced Energy Contrastive Learning), combining large language models (LLMs) with energy-based contrastive learning to separate in-distribution (IND) from OOD nodes.
- LECT generates dependency-aware pseudo-OOD samples using LLM semantic understanding and contextual knowledge, enabling higher-quality OOD augmentation.
- Experiments on six benchmark datasets show that LECT consistently outperforms state-of-the-art baselines, maintaining both high classification accuracy and robust OOD detection.
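For context on the energy-based component: the summary does not give LECT's exact objective, but energy-based OOD detection typically scores a node by the free energy of its logits and trains with a margin regularizer that pushes in-distribution (IND) energies down and pseudo-OOD energies up. Below is a minimal, illustrative sketch of that standard recipe (the margin values `m_in`/`m_out` and all function names are assumptions, not the paper's actual implementation):

```python
import torch
import torch.nn.functional as F

def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Free-energy score: lower energy <=> more in-distribution.
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

def energy_margin_loss(ind_logits: torch.Tensor,
                       ood_logits: torch.Tensor,
                       m_in: float = -5.0,
                       m_out: float = -1.0) -> torch.Tensor:
    # Hinge-style regularizer: push IND energies below m_in and
    # pseudo-OOD energies (e.g., LLM-generated samples) above m_out.
    # Margin values here are illustrative, not from the paper.
    e_in = energy_score(ind_logits)
    e_out = energy_score(ood_logits)
    return (F.relu(e_in - m_in) ** 2).mean() + (F.relu(m_out - e_out) ** 2).mean()

# Toy check: a confident IND prediction has lower energy than a flat one.
ind = torch.tensor([[10.0, 0.0, 0.0]])
ood = torch.tensor([[1.0, 1.0, 1.0]])
assert energy_score(ind).item() < energy_score(ood).item()
```

At test time, nodes whose energy exceeds a threshold calibrated on validation data would be flagged as OOD; LECT's contribution, per the summary, is supplying higher-quality pseudo-OOD samples for this kind of training via LLM-generated, dependency-aware augmentation.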

