TurboMemory: Claude-style long-term memory with 4-bit/6-bit embeddings (runs locally) – looking for contributors

Reddit r/LocalLLaMA / 4/2/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage

Key Points

  • TurboMemory is a locally runnable, Claude-style long-term memory system for AI agents/chatbots that stores semantic memory using TurboQuant-style embedding compression at 4-bit/6-bit/8-bit precision.
  • The approach combines compressed packed embeddings with a SQLite index for fast retrieval, plus topic-centroid prefiltering to reduce search cost.
  • It includes ongoing memory management via a daemon that consolidates (merges/prunes) stored memories automatically, along with contradiction detection and confidence decay to keep memories up to date.
  • The project is seeking early contributors, especially people with Python and systems/ML skills, with suggested first issues covering benchmarks, packaging, retrieval/scoring improvements, and testing.
  • Feedback is invited from builders of AI agents on what features are missing from the current design, indicating an early-stage community development effort.

Hey all,

I’m building TurboMemory — a local long-term memory system for AI agents / chatbots.

Main idea:

  • store semantic memory using TurboQuant-style compression
  • 4-bit / 6-bit / 8-bit packed embeddings
  • SQLite index for fast lookup
  • topic-centroid prefilter to reduce search cost
  • daemon consolidation (merge/prune old memories automatically)
  • contradiction detection + confidence decay
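For readers curious what "4-bit packed embeddings" can look like in practice, here is a minimal sketch: per-vector affine quantization to 16 levels, with two 4-bit codes packed into each byte. The function names (`pack_4bit`, `unpack_4bit`) and the min/max quantization scheme are illustrative assumptions, not TurboMemory's actual API or the TurboQuant algorithm.

```python
import numpy as np

def pack_4bit(vec: np.ndarray) -> tuple[bytes, float, float]:
    """Quantize a float vector to 4-bit codes, two codes per byte.

    Illustrative min/max affine quantization; real schemes (including
    TurboQuant) may differ, e.g. using rotations or per-block scales.
    """
    lo, hi = float(vec.min()), float(vec.max())
    scale = (hi - lo) / 15 or 1.0          # 16 levels; avoid zero scale
    codes = np.round((vec - lo) / scale).astype(np.uint8)  # values 0..15
    if codes.size % 2:                     # pad so codes pack into pairs
        codes = np.append(codes, np.uint8(0))
    packed = (codes[0::2] << 4) | codes[1::2]   # high/low nibble per byte
    return packed.tobytes(), lo, scale

def unpack_4bit(blob: bytes, lo: float, scale: float, dim: int) -> np.ndarray:
    """Inverse of pack_4bit: recover an approximate float vector."""
    packed = np.frombuffer(blob, dtype=np.uint8)
    codes = np.empty(packed.size * 2, dtype=np.uint8)
    codes[0::2] = packed >> 4
    codes[1::2] = packed & 0x0F
    return codes[:dim].astype(np.float32) * scale + lo
```

A 384-dimensional float32 embedding (1,536 bytes) packs into 192 bytes this way, and the reconstruction error is bounded by half a quantization step, which is what makes storing many memories in a single SQLite file practical.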

It’s meant to be a lightweight “Claude-style memory” that runs on your laptop.

Repo: https://github.com/Kubenew/TurboMemory

I’m looking for early contributors (Python + systems/ML folks).

Good first issues: benchmarks, packaging, improving retrieval/scoring, tests.

If you build agents, I’d love feedback: what features are missing?

submitted by /u/Hopeful-Priority1301