What Does a Meow Mean? In Search of Intuitively Understandable Communication by a Nonverbal Companion Robot

arXiv cs.RO / 5/5/2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The study aims to improve communication between older adults and a nonverbal “cat robot” by using intuitive auditory (cat sounds) and visual (display icons) signals.
  • Researchers created an initial set of communication cues for limited-assistive functions and refined it based on feedback from a pilot study and a focus group of older adults.
  • A large online experiment tested whether adults aged 65+ could correctly infer the robot’s communicative intentions, finding that combined visual+auditory cues produced the highest accuracy.
  • Accuracy dropped when visual cues were missing, and the effect of losing auditory cues was mixed—auditory signals helped mainly when the robot conveyed strong emotions such as purring during petting.

Abstract

Older adults living alone face a number of challenges, and robots can help with some of them: by providing reminders, initiating activity, or offering comfort. As part of developing a cat robot with limited assistive functions, we designed a set of nonverbal communication signals, both auditory (cat sounds) and visual (icons on a small display). To evaluate these signals, we used a mixed-methods, user-centered approach. After a pilot study, a focus group with older adults suggested revisions to the initial signal set. A large-sample online experiment then tested whether adults over the age of 65 could accurately infer the robot's communicative intentions. When both visual and auditory signals were present, accuracy was high. When visual signals were absent, accuracy often decreased; when auditory signals were absent, accuracy sometimes increased. The auditory signals were therefore less helpful overall, except when the robot conveyed strong sentiments (e.g., purring while being petted).