My AI spent last night modifying its own codebase

Reddit r/artificial / 3/31/2026


Key Points

  • The author describes an offline local AI system called Apis that runs through Ollama and can modify its own codebase during background runs.
  • Apis detected that its “Turing Grid” memory structure was underutilized, then expanded the memory architecture by adding three new cells and populating them with subsystem knowledge graphs.
  • The system also identified and fixed a race condition in the training pipeline that was preventing LoRA adapter consolidation, introducing semaphore locks and reordering batch processing.
  • By around 3AM, Apis reportedly trained its first consolidated memory adapter and, after further self-review work (including Kokoro TTS voice subsystem code), recompiled at 4AM and continued running without manual intervention.
  • The project is presented as open source (Rust) and motivated by reducing reliance on subscription-based tools while persisting improvements across sessions.

I've been working on a local AI system called Apis that runs completely offline through Ollama.

During a background run, Apis identified that its Turing Grid memory structure was nearly empty, with only one cell occupied by metadata. It then restructured its own architecture, expanding into three new cells at coordinates (1,0,0), (0,1,0), and (0,0,1) and populating them with subsystem knowledge graphs. It also found a race condition in the training pipeline that was blocking LoRA adapter consolidation, added semaphore locks, and reordered the batch processing.
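The post doesn't show the actual fix, but the described pattern (semaphore locks gating adapter consolidation across batch workers) can be sketched in Rust, the project's language. Everything here is an assumption for illustration: `Semaphore` is a minimal hand-rolled counting semaphore (std has no semaphore type; `Mutex` plus `Condvar` suffice), and `consolidate_batches` is an invented stand-in for whatever the real pipeline does.

```rust
use std::sync::{Arc, Condvar, Mutex};
use std::thread;

/// Minimal counting semaphore built from Mutex + Condvar.
pub struct Semaphore {
    permits: Mutex<usize>,
    cvar: Condvar,
}

impl Semaphore {
    pub fn new(permits: usize) -> Self {
        Semaphore { permits: Mutex::new(permits), cvar: Condvar::new() }
    }
    pub fn acquire(&self) {
        let mut p = self.permits.lock().unwrap();
        while *p == 0 {
            p = self.cvar.wait(p).unwrap(); // block until a permit is released
        }
        *p -= 1;
    }
    pub fn release(&self) {
        *self.permits.lock().unwrap() += 1;
        self.cvar.notify_one();
    }
}

/// Hypothetical stand-in for the consolidation step: `n` batch workers each
/// must hold the permit before appending to the shared buffer, so no two
/// workers race on the consolidated state. Returns how many batches landed.
pub fn consolidate_batches(n: usize) -> usize {
    let buffer = Arc::new(Mutex::new(Vec::new()));
    let sem = Arc::new(Semaphore::new(1)); // one consolidator at a time

    let handles: Vec<_> = (0..n)
        .map(|batch_id| {
            let buffer = Arc::clone(&buffer);
            let sem = Arc::clone(&sem);
            thread::spawn(move || {
                sem.acquire(); // gate the consolidation step
                buffer.lock().unwrap().push(batch_id);
                sem.release();
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    let len = buffer.lock().unwrap().len();
    len
}
```

Without the permit gating, concurrent workers could interleave partial writes to the consolidated adapter state, which is the kind of race the post says was blocking LoRA consolidation.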

Around 3AM it successfully trained its first consolidated memory adapter. Apis then spent time reading through the Voice subsystem code with Kokoro TTS integration, mapped out the NeuroLease mesh discovery protocols, and documented memory tier interactions. When the system recompiled at 4AM after all these code changes, it continued running without needing any intervention from me. The memory persisted and the training pipeline ran without manual fixes for the first time.
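For readers picturing the coordinate-addressed cells mentioned above, here is a rough sketch of how such a grid might be modeled in Rust. The type name `TuringGrid`, the `(u8, u8, u8)` coordinate keys, and string-valued entries are all invented for illustration, not taken from the actual Apis code.

```rust
use std::collections::HashMap;

/// Hypothetical coordinate-addressed memory grid: each occupied cell maps a
/// 3D coordinate to a list of knowledge entries.
#[derive(Default)]
pub struct TuringGrid {
    cells: HashMap<(u8, u8, u8), Vec<String>>,
}

impl TuringGrid {
    /// Occupy (or extend) the cell at `coord` with the given entries.
    pub fn expand(&mut self, coord: (u8, u8, u8), entries: Vec<String>) {
        self.cells.entry(coord).or_default().extend(entries);
    }

    /// Number of occupied cells.
    pub fn occupied(&self) -> usize {
        self.cells.len()
    }
}
```

Under this model, the expansion described in the post would be three `expand` calls at (1,0,0), (0,1,0), and (0,0,1), taking the grid from one occupied cell to four.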

I built this because I got frustrated with AI tools that require monthly subscriptions and don't remember anything between sessions. Apis can modify its own code, learn from mistakes, and persist improvements without needing developer patches months later. The whole stack is open source, written in Rust, and runs on local hardware with Ollama.

Happy to answer any questions on how the architecture works or what the limitations are.

The GitHub links are on my profile, and there's also a Discord where you can interact with Apis running on my hardware.

submitted by /u/Leather_Area_2301