We built a way for two people's AI context to talk to each other (without sharing their conversations)

Reddit r/artificial / 5/7/2026

💬 Opinion · Ideas & Deep Analysis · Tools & Practical Usage

Key Points

  • The authors discuss how using AI/LLMs in relationships can create “bubbles,” because each person’s AI builds a picture based only on that person’s perspective.
  • They propose enabling two people in the same relationship to have AI context interact without exchanging their actual conversation logs.
  • They built a system where one person never sees what the other said, yet both sides’ separately learned insights are used to provide each person a less one-sided view.
  • While the approach may not fully solve the problem, the authors believe it’s a worthwhile experiment and invite others to comment on whether this “bubble” effect is common.

We've been thinking about how we use AI in our relationships. A big part of it is about other people: talking about them, figuring out what to say to them, understanding why they did this or that. So the AI builds up this picture of the people in our lives, but only from our perspective. Every user is just... in their own bubble.

We started wondering what happens if both people in a relationship are using AI to process the same dynamic independently. You've got two separate, privately-held pictures of the same relationship sitting in two different chat windows and they never talk to each other.

So we built something where they can. Not by sharing your conversations (the other person never sees what you said). It just uses what it learned from both sides separately to give each person a less one-sided picture.
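The post doesn't describe the mechanism in detail, but the core idea — exchange derived insights between two private contexts while the raw logs never cross — can be sketched roughly. Everything below (the `Side` class, the keyword-based `derived_insights` stand-in for an LLM summarization step, the `broker` function) is a hypothetical illustration, not the authors' actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Side:
    """One person's private AI context. Raw messages never leave this object."""
    name: str
    _log: list[str] = field(default_factory=list)

    def chat(self, message: str) -> None:
        # In the real system this would be a full LLM conversation turn;
        # here we just record the message.
        self._log.append(message)

    def derived_insights(self) -> set[str]:
        # Crude stand-in for an LLM summarization step: surface recurring
        # topic words instead of exposing the underlying messages.
        words = (w.strip(".,!?").lower() for m in self._log for w in m.split())
        return {w for w in words if len(w) > 6}

def broker(a: Side, b: Side) -> dict[str, set[str]]:
    """Exchange only derived insights between the two contexts.

    Each person receives what the *other* side's context adds to their own
    picture; neither side's raw log is ever read by the broker.
    """
    ins_a, ins_b = a.derived_insights(), b.derived_insights()
    return {
        a.name: ins_b - ins_a,  # what B's perspective adds for A
        b.name: ins_a - ins_b,  # what A's perspective adds for B
    }
```

For example, if one side's log mentions feeling "unheard" and the other's mentions feeling "defensive", the broker hands each person only the other's derived themes — never the sentences those themes came from.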

Probably not fully solved but felt worth building. Anyone else noticed the bubble thing?

submitted by /u/Standard-While-2454