Hey,
I’m looking to upgrade my hardware for local LLM use. I’m not quite sure yet which solution to go with.
My budget is around €6,500.
I’m considering buying a MacBook Pro M5 Max with 128 GB of unified memory. From what I’ve heard, that seems to be the best option for loading the largest models for text; for image generation, my 4090 is probably still the better choice?
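For a rough sanity check on what 128 GB can actually hold: a quantized model's weight footprint is roughly parameter count × bits per weight ÷ 8, plus KV cache and runtime overhead on top. A quick back-of-envelope sketch (the parameter counts and ~4.5 bpw for a Q4-class quant are illustrative assumptions, not benchmarks):

```python
# Rough weight-memory estimate for quantized LLMs.
# Ignores KV cache, context length, and runtime overhead, so treat
# the result as a lower bound on required memory.
def weight_footprint_gb(params_billion: float, bits_per_weight: float) -> float:
    # params * bits / 8 gives bytes; divide by 1e9 for decimal GB
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# Illustrative model sizes (assumed, ~4.5 bpw is typical for Q4_K_M-class quants):
for params in (32, 70, 120):
    print(f"{params}B @ ~4.5 bpw: {weight_footprint_gb(params, 4.5):.1f} GB")
```

By this estimate even a ~120B dense model at a 4-bit-class quant fits in 128 GB with room for context, while a 24 GB 4090 tops out around ~30B at the same quant level.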
Power consumption should also be significantly lower than cobbling together some kind of dual-GPU rig, which might be overkill for text processing in the long run anyway (besides, I’m running out of space on my desk lol).
I’ve also heard of mini-PCs like the Acemagic M1A Pro+ and the Beelink GTR9 Pro with the AMD Ryzen AI Max+ 395. With my budget, I could almost buy two of those lol. But those things are probably a lot louder than a MacBook, right?
Do you guys have any suggestions? Which option is more future-proof, and which will give me better performance (MLX on the Mac, or GGUF on the AMD)?
My primary use case would be to have AI handle boilerplate programming (Qwen Coder Next or Gemma4 or whatever other models might pop up in the future).
What other options have I overlooked? Buying four used 3090s for a quad setup?