Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis

arXiv cs.CV, April 27, 2026


Key Points

  • The paper announces “Inter-Stance,” a new publicly available dyadic multimodal corpus designed for conversational stance analysis in real social interactions.
  • The dataset covers synchronized multimodal signals from 90 participants across 45 dyads, including 2D/3D facial data, thermal dynamics, voice and speech, physiological measures (PPG, EDA, heart rate, blood pressure, respiration), and self-reported affect.
  • It includes two dyad types—pairs with shared past history and strangers—and provides annotations for social signals as well as stance categories such as agreement, disagreement, and neutral.
  • The study includes experiments evaluating how multimodal dyadic communication and affect differ between dyads with and without interpersonal history.
  • The release comprises 20 TB of data, intended to enable new multimodal modeling of interpersonal behavior by the research community.

Abstract

Social interactions dominate our perceptions of the world and shape our daily behavior by attaching social meaning to acts as simple and spontaneous as gestures, facial expressions, voice, and speech. People mimic and otherwise respond to each other's postures, facial expressions, mannerisms, and other verbal and nonverbal behavior, and form appraisals or evaluations in the process. Yet no publicly available dataset includes multimodal recordings and self-report measures of multiple persons in social interaction; dyadic recordings and annotations are lacking. We present a new data corpus of multimodal dyadic interaction (45 dyads, 90 persons) that includes synchronized multimodal behavior (2D face video, 3D face geometry, thermal-spectrum dynamics, voice and speech behavior, and physiology: PPG, EDA, heart rate, blood pressure, and respiration) and self-reported affect of all participants in a communicative interaction scenario. Two types of dyads are included: persons with a shared past history and strangers. Annotations cover social signals and stance categories (agreement, disagreement, and neutral). With a potent emotion induction, these multimodal data will enable novel modeling of multimodal interpersonal behavior. We present extensive experiments evaluating the multimodal dyadic communication, and the affect, of dyads with and without interpersonal history. This new database will enable multimodal modeling of social interaction that was not possible before. The dataset comprises 20 TB of multimodal data to be shared with the research community.
