AI Navigate

Newest GPU server in the lab! 72gb ampere vram!

Reddit r/LocalLLaMA / 3/19/2026

📰 News · Developer Stack & Infrastructure · Models & Research

Key Points

  • A new GPU server with 72 GB Ampere VRAM was built in the lab to support large AI models.
  • It is reportedly running gpt-oss 120B at 90 t/s and Qwen 3.5 35B A3B at 80 t/s.
  • The node serves as the host for an RPC mesh with two 64 GB Orin development kits.
  • The post was submitted by /u/braydon125 on Reddit's LocalLLaMA and links to a video.

Built this beautiful monstrosity to satisfy my mental illness. Running gpt-oss 120B at 90 t/s, Qwen 3.5 35B A3B at 80 t/s.

This node acts as the host for my RPC mesh with the two 64 GB Orin dev kits.
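The post doesn't name the software behind the "RPC mesh", but the most common setup of this kind is llama.cpp's RPC backend: each remote node (here, the Orin dev kits) runs an `rpc-server` instance, and the host points its inference command at them so model layers are split across the machines. A minimal sketch under that assumption (the IP addresses, port, and model file are hypothetical placeholders):

```shell
# On each Orin dev kit: build llama.cpp with the RPC backend enabled,
# then expose this node's GPU to the mesh over the network.
# (Hypothetical addresses: 192.168.1.11 and 192.168.1.12, port 50052.)
cmake -B build -DGGML_RPC=ON && cmake --build build
./build/bin/rpc-server --host 0.0.0.0 --port 50052

# On the host node: run inference, listing the remote rpc-server
# endpoints so llama.cpp distributes layers across all three machines.
./build/bin/llama-cli \
  -m gpt-oss-120b.gguf \
  --rpc 192.168.1.11:50052,192.168.1.12:50052 \
  -ngl 99 \
  -p "Hello"
```

Note that the RPC link is unauthenticated and unencrypted, so this layout only makes sense on a trusted local network, as in the lab setup described here.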

submitted by /u/braydon125