Your Robot Will Feel You Now: Empathy in Robots and Embodied Agents

arXiv cs.AI / 2026/3/24


Key Points

  • The article reviews research in human-robot interaction and embodied conversational agents on how empathy can be implemented in machines through multimodal social and emotional behaviors.
  • It examines what empathic behaviors and models have been implemented by mimicking human and animal cues, including facial expressions, gestures, and speech.
  • The review discusses how researchers have explored machine-specific analogies for empathy rather than relying solely on human-like imitation.
  • It aims to translate these lessons from HRI and ECAs to today's predominantly language-based agents (e.g., ChatGPT), suggesting a path for adding embodied and empathic capabilities to LLM-era systems.

Abstract

The fields of human-robot interaction (HRI) and embodied conversational agents (ECAs) have long studied how empathy could be implemented in machines. One of the major drivers has been the goal of giving multimodal social and emotional intelligence to these artificially intelligent agents, which interact with people through facial expressions, body movements, gestures, and speech. What empathic behaviors and models have these fields implemented by mimicking human and animal behavior? In what ways have they explored creating machine-specific analogies? This chapter aims to review the knowledge from these studies, towards applying the lessons learned to today's ubiquitous, language-based agents such as ChatGPT.