SCHK-HTC: Sibling Contrastive Learning with Hierarchical Knowledge-Aware Prompt Tuning for Hierarchical Text Classification
arXiv cs.CL / 4/20/2026
📰 News · Models & Research
Key Points
- The paper addresses few-shot hierarchical text classification (HTC), where each text must be assigned labels arranged in a tree structure using only a small amount of labeled data.
- It argues that existing methods’ reliance on hierarchical consistency constraints is insufficient, especially for separating semantically similar sibling classes when domain knowledge is scarce.
- The proposed method, SCHK-HTC, combines a hierarchical knowledge extraction module with sibling contrastive learning guided by hierarchical knowledge-aware prompt tuning (a rough sketch of the sibling contrastive objective follows this list).
- By learning discriminative representations at multiple levels of the label hierarchy, the approach improves the separability of easily confused classes.
- Experiments on three benchmark datasets show performance gains over prior state-of-the-art methods in most cases, with accompanying code released on GitHub.
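
As a rough illustration of the sibling contrastive idea, the sketch below implements a supervised contrastive loss whose negatives are restricted to sibling classes under the same parent. This is a minimal PyTorch sketch under assumed conventions: the function name, the temperature value, and the way the hierarchy is passed (a `parent` dict) are illustrative assumptions, not the paper's actual objective or its released code, and the hierarchical knowledge-aware prompt tuning component is omitted.

```python
# Minimal sketch of a sibling-restricted supervised contrastive loss.
# Assumption-heavy: not SCHK-HTC's exact formulation.
import torch
import torch.nn.functional as F

def sibling_contrastive_loss(embeddings, labels, parent, temperature=0.1):
    """Contrastive loss where negatives are sibling classes only.

    embeddings: (N, d) text representations.
    labels:     (N,) leaf-label ids.
    parent:     dict mapping each label id to its parent id in the hierarchy.
    Positives are same-label pairs; negatives are pairs whose labels share a
    parent (siblings), where confusion concentrates in HTC.
    """
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.t() / temperature                  # (N, N) cosine similarities
    n = labels.size(0)

    parents = torch.tensor([parent[int(l)] for l in labels])
    same_label = labels.unsqueeze(0) == labels.unsqueeze(1)
    same_parent = parents.unsqueeze(0) == parents.unsqueeze(1)
    eye = torch.eye(n, dtype=torch.bool)

    pos = same_label & ~eye                        # positives: same leaf label
    neg = same_parent & ~same_label                # negatives: sibling labels
    valid = pos | neg                              # softmax over siblings only

    sim = sim.masked_fill(~valid, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Keep only anchors that have both an in-batch positive and a sibling
    # negative; rows without valid pairs would otherwise yield NaN.
    has_pair = pos.any(dim=1) & neg.any(dim=1)
    loss = -(log_prob * pos).sum(dim=1)[has_pair] / pos.sum(dim=1)[has_pair]
    return loss.mean()

if __name__ == "__main__":
    # Toy example: labels 0 and 1 are siblings under parent 10; label 2 is not.
    emb = torch.randn(4, 8)
    labels = torch.tensor([0, 0, 1, 2])
    parent = {0: 10, 1: 10, 2: 11}
    print(sibling_contrastive_loss(emb, labels, parent))
```

Restricting the softmax denominator to siblings concentrates the gradient signal on exactly the pairs the paper identifies as hardest to separate, rather than spending it on classes that are already far apart in the hierarchy.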