
Speak or Stay Silent: Context-Aware Turn-Taking in Multi-Party Dialogue

arXiv cs.AI · March 13, 2026


Key Points

  • The paper formulates context-aware turn-taking for multi-party dialogue, requiring the assistant to decide at every detected pause whether to speak or stay silent based on the full conversation context.
  • It introduces a benchmark of over 120K labeled conversations spanning three multi-party corpora to evaluate turn-taking behavior.
  • The study shows eight recent large language models fail at context-aware turn-taking under zero-shot prompting.
  • It proposes a supervised fine-tuning approach with reasoning traces, improving balanced accuracy by up to 23 percentage points.
  • It concludes that context-aware turn-taking is not an emergent capability and must be explicitly trained.
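The core formulation above is a binary decision at each detected pause, conditioned on the full conversation so far. A minimal sketch of how that decision could be posed to an LLM is shown below; the prompt template and conversation data are hypothetical illustrations, not the paper's actual setup.

```python
# Hypothetical sketch: framing speak-or-stay-silent as a binary
# classification prompt over the full multi-party context.
# The template wording is an assumption, not the paper's prompt.

PROMPT_TEMPLATE = (
    "You are an AI assistant in a multi-party conversation.\n"
    "Conversation so far:\n{context}\n"
    "A pause has been detected. Should the assistant speak now? "
    "Answer SPEAK or SILENT."
)

def build_pause_prompt(turns):
    """Render the full conversation context into a single prompt string."""
    context = "\n".join(f"{speaker}: {utterance}" for speaker, utterance in turns)
    return PROMPT_TEMPLATE.format(context=context)

# Example multi-party exchange where the last turn addresses the assistant.
turns = [
    ("Alice", "Did everyone get the agenda?"),
    ("Bob", "Yes, though item three is unclear."),
    ("Alice", "Assistant, can you summarize item three?"),
]
prompt = build_pause_prompt(turns)
```

At inference time, the model's SPEAK/SILENT answer would be parsed and compared against the benchmark label for that pause.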

Abstract

Existing voice AI assistants treat every detected pause as an invitation to speak. This works in dyadic dialogue, but in multi-party settings, where an AI assistant participates alongside multiple speakers, pauses are abundant and ambiguous. An assistant that speaks on every pause becomes disruptive rather than useful. In this work, we formulate context-aware turn-taking: at every detected pause, given the full conversation context, our method decides whether the assistant should speak or stay silent. We introduce a benchmark of over 120K labeled conversations spanning three multi-party corpora. Evaluating eight recent large language models, we find that they consistently fail at context-aware turn-taking under zero-shot prompting. We then propose a supervised fine-tuning approach with reasoning traces, improving balanced accuracy by up to 23 percentage points. Our findings suggest that context-aware turn-taking is not an emergent capability; it must be explicitly trained.
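The abstract reports gains in balanced accuracy, which is a natural metric here: in multi-party dialogue most pauses call for silence, so plain accuracy rewards an always-silent baseline. A short sketch of the metric (mean of per-class recall; the example labels are illustrative, not from the paper's data):

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall; for binary labels, (TPR + TNR) / 2."""
    recalls = []
    for c in set(y_true):
        idx = [i for i, y in enumerate(y_true) if y == c]
        correct = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(correct / len(idx))
    return sum(recalls) / len(recalls)

# Illustrative imbalanced split: 8 "silent" pauses, 2 "speak" pauses.
y_true = ["silent"] * 8 + ["speak"] * 2
y_pred = ["silent"] * 10  # always-silent baseline
# Plain accuracy would be 0.8, but balanced accuracy is only 0.5,
# exposing the baseline's failure on the "speak" class.
score = balanced_accuracy(y_true, y_pred)
```

This is why a 23-percentage-point improvement in balanced accuracy reflects genuinely better speak/silent discrimination rather than a bias toward the majority class.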