[N] Just found out that Milla Jovovich is a dev, invested in AI, and just open sourced a project

Reddit r/MachineLearning / 4/8/2026


Key Points

  • Milla Jovovich has open-sourced MemPalace, an AI memory system designed to retain and retrieve past conversations with far higher benchmark performance than paid competitors.
  • Instead of asking the AI to select what to remember, MemPalace stores interactions broadly and makes them findable using a “memory palace” data architecture with wings, halls, and rooms plus cross-references.
  • The project reportedly improves retrieval by 34% using the palace structure alone on a dataset of 22,000+ conversation memories, and includes a compression dialect (AAAK) said to compress text ~30x with no information loss.
  • MemPalace achieves 96.6% recall on the LongMemEval benchmark without API calls by running locally (SQLite and ChromaDB), making it free to use and subscription- and key-free.
  • The MIT-licensed system is built to work with major LLMs that can read text (Claude, GPT, Gemini, Llama, Mistral, etc.), with implementation driven by co-creator Ben Sigman.

As a dev and RE fan myself I am kind of amazed: A WOMAN, WITH A PASSION FOR AI, AND IT'S RESIDENT EVIL RELATED


Full text from the source:

Milla Jovovich — yes, the actress from Resident Evil and The Fifth Element — just released an open-source AI memory system that scored higher on standard retrieval benchmarks than every paid competitor on the market. It's called MemPalace, it's free, and it hit 1,100 GitHub stars within days.

The problem it solves is one every heavy AI user knows: every conversation with Claude, ChatGPT, or Copilot disappears when the session ends. Six months of daily use generates roughly 19.5 million tokens — decisions, debugging sessions, architecture debates — all gone. Existing memory tools try to fix this by letting AI decide what's worth keeping. MemPalace takes the opposite approach: store everything, then make it findable through structure.

That structure is borrowed from the ancient Greek method of loci — the memory palace technique — applied as actual data architecture. Conversations are organised into wings (people, projects), halls (types of memory), and rooms (specific topics), with cross-references linking the same topic across different domains. On 22,000+ real conversation memories, the palace structure alone improved retrieval by 34%. The system also includes a compression dialect called AAAK that shrinks natural language by roughly 30x with zero information loss, so an AI can load months of context in about 120 tokens.
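The wings/halls/rooms hierarchy plus cross-references described above can be sketched as a small data structure. This is an illustrative assumption about the described architecture, not MemPalace's actual API; all class and method names here are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the palace hierarchy: wings hold halls, halls
# hold rooms, rooms hold memories. Cross-references index the same
# topic across different wings so retrieval can jump between domains.

@dataclass
class Room:
    topic: str
    memories: list = field(default_factory=list)

@dataclass
class Hall:
    kind: str                      # e.g. "decisions", "debugging"
    rooms: dict = field(default_factory=dict)

@dataclass
class Wing:
    name: str                      # e.g. a person or a project
    halls: dict = field(default_factory=dict)

class Palace:
    def __init__(self):
        self.wings = {}
        self.cross_refs = {}       # topic -> list of (wing, hall) locations

    def store(self, wing, hall, topic, memory):
        w = self.wings.setdefault(wing, Wing(wing))
        h = w.halls.setdefault(hall, Hall(hall))
        r = h.rooms.setdefault(topic, Room(topic))
        r.memories.append(memory)
        self.cross_refs.setdefault(topic, []).append((wing, hall))

    def find(self, topic):
        # Follow cross-references into every room holding this topic.
        hits = []
        for wing, hall in self.cross_refs.get(topic, []):
            hits.extend(self.wings[wing].halls[hall].rooms[topic].memories)
        return hits

palace = Palace()
palace.store("project-x", "debugging", "race condition", "fixed with a mutex")
palace.store("alice", "decisions", "race condition", "alice chose channels instead")
print(palace.find("race condition"))
```

The point of the cross-reference index is that a lookup never scans the whole store; it goes straight to the rooms known to contain the topic, which is plausibly where the claimed 34% retrieval improvement comes from.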

The headline number: 96.6% recall on the LongMemEval benchmark with no API calls at all, and 100% with a lightweight reranking step. For comparison, Mem0 scores around 85% and costs $19–249/month. Zep scores similarly at $25/month+. MemPalace costs nothing — it runs entirely locally on SQLite and ChromaDB, no cloud, no subscriptions, no API keys.
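A fully local store of the kind described is straightforward with Python's built-in SQLite bindings. The sketch below shows only the SQLite half (ChromaDB would layer vector search on top); the schema and function names are assumptions for illustration, not MemPalace's actual layout.

```python
import sqlite3

# Minimal local memory store: no cloud, no API keys, just a SQLite
# database. Use a file path instead of ":memory:" to persist across
# sessions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE memories (
        id   INTEGER PRIMARY KEY,
        wing TEXT, hall TEXT, room TEXT,
        body TEXT
    )
""")

def remember(wing, hall, room, body):
    conn.execute(
        "INSERT INTO memories (wing, hall, room, body) VALUES (?, ?, ?, ?)",
        (wing, hall, room, body),
    )

def recall(query):
    # Naive substring retrieval for the sketch; a real system would add
    # embedding search (e.g. via ChromaDB) and a reranking step.
    rows = conn.execute(
        "SELECT body FROM memories WHERE body LIKE ?", (f"%{query}%",)
    )
    return [r[0] for r in rows]

remember("project-x", "debugging", "caching", "memoized the parser; 4x speedup")
print(recall("parser"))
```

Because everything runs in-process against a local file, there is nothing to subscribe to and no per-call cost, which is the basis for the free-versus-$19–249/month comparison above.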

Jovovich described the development as months of failed blueprints across nearly 1,000 documents before the architecture clicked. Her co-creator Ben Sigman then engineered the implementation. It works with Claude, GPT, Gemini, Llama, Mistral — any model that reads text. MIT-licensed, fully open-source.

Repo

submitted by /u/_giga_sss_