Just got a beast.

Reddit r/LocalLLaMA / 4/28/2026

💬 Opinion · Signals & Early Trends · Tools & Practical Usage

Key Points

  • A Reddit user says they recently acquired a high-end Mac Pro 2019 configured with 1.5TB RAM, 128GB VRAM, and a 28-core processor.
  • They are asking the community what AI models to run and benchmark on this machine.
  • The user specifically wants to test whether GLM 5.2 can run effectively with the “experts” offloaded to VRAM, though results are uncertain.
  • The post is a request for practical evaluation, not an announcement, launch, or documented benchmark result.

1.5 tb ram with 128gb vram and a 28 core processor. Mac Pro 2019. What models do you guys want me to run and benchmark? I’m hoping it can handle glm 5.2 with the experts offloaded to vram but we will see !
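Whether the "experts offloaded to VRAM" plan is feasible comes down to a simple memory-budget estimate: routed-expert weights usually dwarf the dense (attention/shared) weights in an MoE model. A minimal sketch of that arithmetic follows; the parameter counts used are illustrative placeholders, not published GLM 5.2 specs, and the dense/expert split is a deliberate simplification:

```python
def moe_memory_split_gib(total_params_b, active_params_b, bits_per_weight):
    """Estimate GiB needed for expert vs. non-expert weights of an MoE model.

    total_params_b  -- total parameter count, in billions
    active_params_b -- parameters active per token (dense + routed experts), in billions
    bits_per_weight -- quantization width (e.g. 4 for Q4, 8 for Q8)
    """
    bytes_per_param = bits_per_weight / 8
    to_gib = lambda params_b: params_b * 1e9 * bytes_per_param / 2**30
    # Rough split: treat the active set as the always-resident dense weights
    # and everything else as routed-expert weights. Real layouts differ.
    dense_gib = to_gib(active_params_b)
    expert_gib = to_gib(total_params_b - active_params_b)
    return dense_gib, expert_gib

# Illustrative numbers only (NOT real GLM 5.2 specs): a hypothetical
# 350B-total / 30B-active MoE quantized to 4 bits per weight.
dense, experts = moe_memory_split_gib(350, 30, 4)
print(f"dense ~{dense:.0f} GiB, experts ~{experts:.0f} GiB")
```

Under these placeholder numbers the routed experts alone (~149 GiB at 4-bit) would overflow 128 GB of VRAM, while the 1.5 TB of system RAM leaves ample room for whatever spills over, which is presumably why the poster hedges with "we will see".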

submitted by /u/habachilles