Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis
arXiv cs.CV / 4/27/2026
📰 News · Signals & Early Trends · Models & Research
Key Points
- The paper announces “Inter-Stance,” a new publicly available dyadic multimodal corpus designed for conversational stance analysis in real social interactions.
- The dataset covers synchronized multimodal signals from 90 participants across 45 dyads, including 2D/3D facial data, thermal dynamics, voice and speech, physiological measures (PPG, EDA, heart rate, blood pressure, respiration), and self-reported affect.
- It includes two dyad types, pairs with a shared past history and strangers, and provides annotations for social signals as well as stance categories: agreement, disagreement, and neutral.
- The study includes experiments evaluating how multimodal dyadic communication and affect differ between dyads with and without interpersonal history.
- The release comprises 20 TB of data, intended to enable new multimodal modeling of interpersonal behavior by the research community.
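To make the corpus structure concrete, the sketch below models one dyadic session with its modality streams and time-aligned stance labels. This is a hypothetical schema, not the paper's actual release format: all class, field, and function names (`DyadSession`, `count_stances`, etc.) are illustrative assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Stance(Enum):
    """Stance categories named in the paper's annotation scheme."""
    AGREEMENT = "agreement"
    DISAGREEMENT = "disagreement"
    NEUTRAL = "neutral"

@dataclass
class DyadSession:
    """One recorded dyadic interaction (hypothetical schema, not the official API)."""
    dyad_id: str
    shared_history: bool  # True for acquainted pairs, False for strangers
    # Paths to synchronized modality streams (illustrative names only)
    face_2d: Optional[str] = None
    face_3d: Optional[str] = None
    thermal: Optional[str] = None
    audio: Optional[str] = None
    physiology: dict = field(default_factory=dict)   # e.g. {"ppg": ..., "eda": ...}
    stance_labels: list = field(default_factory=list)  # [(start_s, end_s, Stance), ...]

def count_stances(sessions):
    """Tally stance segments separately for history vs. stranger dyads."""
    counts = {}
    for s in sessions:
        key = "history" if s.shared_history else "strangers"
        for _start, _end, stance in s.stance_labels:
            counts.setdefault(key, {}).setdefault(stance, 0)
            counts[key][stance] += 1
    return counts
```

A tally like `count_stances` is the kind of first-pass analysis the dyad-type comparison in the paper's experiments would build on, e.g. comparing agreement rates between acquainted pairs and strangers.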