Text-Attributed Knowledge Graph Enrichment with Large Language Models for Medical Concept Representation
arXiv cs.LG / 4/16/2026
Key Points
- The paper introduces CoMed, an LLM-empowered framework that improves medical concept representation for EHR mining by enriching knowledge graphs with semantic information derived from clinical text and code relations.
- It addresses missing cross-type dependencies and incomplete clinical semantics by constructing a global KG from EHR-mined associations and LLM-inferred, type-constrained relations.
- CoMed further enriches the KG into a text-attributed graph by generating node descriptions and edge rationales, which provides training signals for both concepts and their interconnections.
- The method jointly trains a LoRA-tuned LLaMA text encoder with a heterogeneous GNN to fuse text semantics and graph structure into unified medical concept embeddings.
- Experiments on MIMIC-III and MIMIC-IV report consistent improvements on downstream prediction tasks and show that CoMed can serve as a plug-in concept encoder in standard EHR pipelines.
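To make the fusion idea in the points above concrete, here is a minimal toy sketch, not the paper's implementation: node "text embeddings" are random stand-ins for LLaMA-encoded descriptions, the tiny typed KG and relation names are invented for illustration, and one relation-aware message-passing step plays the role of the heterogeneous GNN that mixes graph structure into each concept's text semantics.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy typed KG: (head, relation, tail) triples across concept types.
# Node and relation names are hypothetical examples, not from the paper.
nodes = ["diag:sepsis", "drug:vancomycin", "proc:blood_culture"]
edges = [
    ("diag:sepsis", "treated_by", "drug:vancomycin"),
    ("diag:sepsis", "confirmed_by", "proc:blood_culture"),
]

# Stand-in "text embeddings" (in CoMed these would come from a
# LoRA-tuned LLaMA encoder applied to generated node descriptions).
text_emb = {n: rng.normal(size=DIM) for n in nodes}

# One learnable-style weight matrix per relation type
# (the heterogeneous part of the toy GNN layer).
rel_w = {r: rng.normal(size=(DIM, DIM)) / np.sqrt(DIM)
         for r in {e[1] for e in edges}}

def hetero_gnn_layer(text_emb, edges, rel_w):
    """Average relation-transformed neighbor messages into each node,
    then fuse them with the node's own text embedding."""
    msgs = {n: [] for n in text_emb}
    for h, r, t in edges:
        # Propagate both directions so every endpoint sees its neighbor.
        msgs[h].append(rel_w[r] @ text_emb[t])
        msgs[t].append(rel_w[r] @ text_emb[h])
    out = {}
    for n, e in text_emb.items():
        agg = np.mean(msgs[n], axis=0) if msgs[n] else np.zeros(DIM)
        # Residual-style fusion of text semantics and graph structure.
        out[n] = e + agg
    return out

concept_emb = hetero_gnn_layer(text_emb, edges, rel_w)
print({n: v.shape for n, v in concept_emb.items()})
```

In the actual framework the text encoder and GNN are trained jointly, so gradients from downstream EHR prediction shape both the textual and structural components of the unified embedding; the sketch only shows the forward fusion.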