If it works - don’t touch it: COMPETITION

Reddit r/LocalLLaMA / 4/14/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Ideas & Deep Analysis

Key Points

  • A Reddit post invites people to share unconventional “home inference system” builds and join a friendly competition, emphasizing the principle “If it works—don’t touch it.”
  • The author's setup is a local LLM inference system built around 4x RTX 3090 GPUs, 128GB of DDR4 memory, and an 18-core/36-thread CPU.
  • The author mentions ongoing tinkering and a possible case redesign, but defers any changes to avoid disrupting a working setup.
  • The main value is community-driven sharing of real-world local inference hardware configurations rather than a formal technical guide.

<3

Come on, share your "weird" home inference system builds. Let's have a little friendly competition. I think I am the absolute leader.

I took the grill from my wife’s oven, and I also found an egg carton.
I will design a new case, but not now.

If it works - don’t touch it.

4x3090, 128GB DDR4, 18/36 Cores

submitted by /u/awfulalexey