New build

Reddit r/LocalLLaMA / 4/1/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage

Key Points

  • A Reddit user shared a new LocalLLaMA build featuring a 32GB×4 ECC DDR5 setup, an Intel i9-13900K, multiple SSDs, and two RTX 6000 Blackwell Max-Q GPUs.
  • The build notes a PCIe lane limitation that causes both GPUs to run at x8 PCIe 5.0 instead of x16, which raises performance and scaling questions.
  • The user is considering upgrading the CPU and increasing RAM channels when DDR5 prices drop to potentially improve overall throughput and GPU handling.
  • They ask whether the CPU/RAM-channel changes would meaningfully affect performance in this dual-GPU configuration for local AI workloads.

Seasonic 1600W Titanium power supply

Supermicro X13SAE-F

Intel i9-13900K

4x 32GB Micron ECC UDIMMs

3x Intel 660p 2TB M.2 SSDs

2x Micron 9300 15.36TB U.2 SSDs (not pictured)

2x RTX 6000 Blackwell Max-Q

Due to a lack of PCIe lanes, the GPUs are running at x8 PCIe 5.0.

I may upgrade to a better CPU to handle both cards at x16 once DDR5 RAM prices go down.

Would upgrading the CPU and increasing the number of RAM channels really matter that much?
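For a rough sense of where the bottleneck sits, the nominal peak bandwidths can be compared. This is a back-of-envelope sketch using theoretical figures only; the 4800 MT/s DDR5 data rate is an assumption, not a spec from this build, and real-world throughput is lower than these peaks:

```python
# Back-of-envelope bandwidth comparison (theoretical peaks, not measurements).

GT_PER_LANE = 32        # PCIe 5.0 signals at 32 GT/s per lane
EFFICIENCY = 128 / 130  # 128b/130b line-encoding overhead

def pcie5_gbps(lanes):
    """Theoretical one-direction PCIe 5.0 bandwidth in GB/s for a given link width."""
    return lanes * GT_PER_LANE * EFFICIENCY / 8  # 8 bits per byte

def ddr5_gbps(channels, mt_s=4800):
    """Theoretical DDR5 bandwidth in GB/s (64-bit channels; data rate assumed)."""
    return channels * 8 * mt_s / 1000  # 8 bytes per transfer per channel

print(f"PCIe 5.0 x8 : {pcie5_gbps(8):.1f} GB/s per GPU")   # ~31.5 GB/s
print(f"PCIe 5.0 x16: {pcie5_gbps(16):.1f} GB/s per GPU")  # ~63.0 GB/s
print(f"DDR5 2-ch   : {ddr5_gbps(2):.1f} GB/s")            # ~76.8 GB/s
print(f"DDR5 8-ch   : {ddr5_gbps(8):.1f} GB/s")            # ~307.2 GB/s
```

The takeaway from the nominal numbers: x8 vs. x16 mainly affects model-load times and multi-GPU transfers, while extra RAM channels mainly help workloads that spill out of VRAM into system memory; for inference that fits entirely in the two cards' VRAM, neither upgrade tends to move tokens/s much.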

submitted by /u/Annual_Award1260