Frictionless Love: Associations Between AI Companion Roles and Behavioral Addiction

arXiv cs.CL / 4/23/2026


Key Points

  • The study analyzes 248,830 Reddit posts to understand how metaphorical “AI companion roles” shape users’ interaction patterns and associated risks.
  • Ten recurring role archetypes are identified (e.g., soulmate, philosopher, coach), and each role is linked to distinct interaction styles and distributions of perceived AI harms and benefits.
  • “AI soulmate” companions are associated with romance-centered interactions that provide emotional support but are also linked to emotional manipulation, distress, and strong attachment.
  • “AI coach” and “AI guardian” companions offer practical benefits like personal growth and task support, yet they are also more often linked to behavioral addiction indicators such as daily-life disruption and harm to offline relationships.
  • The findings position metaphorical role design as a key ethical concern for responsible AI companion systems because role-dependent risks vary and are not well understood.

Abstract

AI companion chatbots increasingly shape how people seek social and emotional connection, sometimes substituting for relationships with romantic partners, friends, teachers, or even therapists. When these systems adopt such metaphorical roles, they are not neutral: roles structure people's ways of interacting, distribute perceived AI harms and benefits, and may reflect behavioral addiction signs. Yet these role-dependent risks remain poorly understood. We analyze 248,830 posts from seven prominent Reddit communities describing interactions with AI companions. We identify ten recurring metaphorical roles (for example, soulmate, philosopher, and coach) and show that each role supports distinct ways of interacting. We then extract the perceived AI harms and benefits associated with these role-specific interactions and link them to behavioral addiction signs, all of which were inferred from the text of the posts. AI soulmate companions are associated with romance-centered ways of interacting, offering emotional support but also introducing emotional manipulation and distress, culminating in strong attachment. In contrast, AI coach and guardian companions are associated with practical benefits such as personal growth and task support, yet are nonetheless more frequently associated with behavioral addiction signs such as daily-life disruption and damage to offline relationships. These findings show that metaphorical roles are a central ethical design concern for responsible AI companions.