Avoiding Over-smoothing in Social Media Rumor Detection with Pre-trained Propagation Tree Transformer
arXiv cs.CL / 3/25/2026
Key Points
- The paper argues that common rumor-detection approaches built on graph neural networks (GNNs) degrade due to over-smoothing, an effect closely tied to the structure of rumor propagation trees, where many nodes are first-level replies attached directly to the source post (a minimal sketch of this effect appears after this list).
- It also finds that GNN-based models have difficulty capturing long-range dependencies along reply propagation trees.
- To address both issues, the authors propose P2T3, a pure Transformer-based method that extracts conversation chains from propagation trees (see the chain-extraction sketch below) and uses token-wise embeddings plus a tailored inductive bias to encode the connection structure.
- P2T3 is pre-trained on large-scale unlabeled data and then evaluated against prior state-of-the-art methods, showing improved performance across multiple benchmarks and in few-shot settings.
- The work suggests the approach can serve as a foundation for future large-model or unified multi-modal approaches to social-media rumor research.
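
As a rough illustration of the over-smoothing point in the first bullet (a generic GCN-style sketch, not the paper's code): on a star-shaped tree where every reply attaches directly to the source post, repeated neighbor averaging drives all node embeddings toward the same vector within a few layers.

```python
import numpy as np

# Minimal over-smoothing sketch (illustrative only, not from the paper):
# a star-shaped propagation tree -- one source post with n direct replies,
# i.e. many first-level nodes. Each layer averages a node's own feature
# with its neighbors' features (a simplified GCN-style update).
n = 8
A = np.zeros((n + 1, n + 1))          # node 0 is the source post
A[0, 1:] = A[1:, 0] = 1               # every reply attaches to the root
A_hat = A + np.eye(n + 1)             # add self-loops
P = np.diag(1.0 / A_hat.sum(axis=1)) @ A_hat  # row-normalized propagation

rng = np.random.default_rng(0)
X = rng.standard_normal((n + 1, 4))   # random 4-dim node features
for _ in range(8):
    X = P @ X                         # one round of neighbor averaging

# Per-dimension spread across nodes collapses toward zero: over-smoothing.
print(np.round(np.ptp(X, axis=0), 4))
```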
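
The chain extraction mentioned in the third bullet can be pictured as enumerating root-to-leaf paths of the reply tree, turning tree structure into sequences a Transformer can consume. The helper below is a hypothetical sketch of that idea, assuming (parent, child) reply edges; it is not P2T3's actual preprocessing.

```python
from collections import defaultdict

def extract_chains(edges, root):
    """Enumerate root-to-leaf conversation chains from reply edges.

    edges: list of (parent_id, child_id) reply relations.
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    chains, stack = [], [(root, [root])]
    while stack:                      # iterative DFS from the source post
        node, path = stack.pop()
        if not children[node]:        # leaf: path is a complete chain
            chains.append(path)
        for child in children[node]:
            stack.append((child, path + [child]))
    return chains

# Source post 0 with one nested reply thread and two direct replies.
edges = [(0, 1), (1, 2), (2, 3), (0, 4), (0, 5)]
print(extract_chains(edges, root=0))
# -> [[0, 5], [0, 4], [0, 1, 2, 3]]
```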