Best Agentic Coding model I can run on the new Macbook M5 Max?

Reddit r/LocalLLaMA / 5/2/2026

💬 Opinion · Signals & Early Trends · Tools & Practical Usage · Models & Research

Key Points

  • The post asks which “agentic coding” model users can run locally on a new 16-inch MacBook Pro with the Apple M5 Max chip.
  • It lists key hardware specifications, including an 18-core CPU, a 40-core GPU with hardware-accelerated ray tracing, a 16-core Neural Engine, and 128GB unified memory.
  • The question is framed around practical local deployment constraints—CPU/GPU/Neural Engine resources and available memory—rather than cloud or enterprise tooling.
  • The content is shared as a Reddit link in the r/LocalLLaMA community, so recommendations would likely draw on community experience with locally run LLMs and coding agents.

16-inch MacBook Pro - M5 Max

  • Chip: Apple M5 Max
  • CPU: 18-core (6 super cores @ 4.6 GHz, 12 performance cores @ 4.4 GHz)
  • GPU: 40-core (hardware-accelerated ray tracing + Neural Accelerators)
  • Memory bandwidth: 614 GB/s
  • Neural Engine: 16-core, optimized for AI/ML
  • Unified memory: 128GB
  • Storage: 2TB SSD
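For framing the question, a rough sizing sketch using only the numbers above: whether a model's quantized weights fit in 128GB of unified memory, and a bandwidth-bound ceiling on decode speed (tokens/sec ≤ bandwidth ÷ bytes read per token, approximated by the weight size). The 75% usable-memory headroom is an assumption, not from the post, as are the example model sizes.

```python
# Back-of-the-envelope sizing for local LLM inference on the M5 Max above.
# Assumptions (not from the post): roughly 75% of unified memory is usable
# for weights + KV cache, and decode is memory-bound, so tokens/sec is
# bounded by bandwidth divided by the bytes read per token (~weight size).

UNIFIED_MEMORY_GB = 128   # from the spec table
BANDWIDTH_GBPS = 614      # from the spec table
USABLE_FRACTION = 0.75    # hypothetical headroom for OS + KV cache


def weight_size_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate in-memory size of model weights in GB."""
    return params_b * bits_per_weight / 8


def fits(params_b: float, bits_per_weight: float) -> bool:
    """True if the quantized weights fit in the usable memory budget."""
    return weight_size_gb(params_b, bits_per_weight) <= UNIFIED_MEMORY_GB * USABLE_FRACTION


def decode_ceiling_tok_s(params_b: float, bits_per_weight: float) -> float:
    """Bandwidth upper bound: each generated token streams all weights once."""
    return BANDWIDTH_GBPS / weight_size_gb(params_b, bits_per_weight)


if __name__ == "__main__":
    # Hypothetical model sizes at 4-bit quantization, for illustration only.
    for params_b in (70, 120, 235):
        size = weight_size_gb(params_b, 4)
        print(f"{params_b}B @ 4-bit: {size:.0f} GB, fits={fits(params_b, 4)}, "
              f"<= {decode_ceiling_tok_s(params_b, 4):.0f} tok/s ceiling")
```

By this estimate a 70B model at 4-bit (~35 GB) fits easily with a ceiling around 17 tok/s, while a dense ~235B model at 4-bit (~118 GB) would not leave room for the OS and KV cache.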
submitted by /u/UnknownEssence