Best budget local LLM for coding

Reddit r/LocalLLaMA / 3/23/2026

💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage · Models & Research

Key Points

  • The post seeks affordable, locally runnable LLM options suitable for coding tasks.
  • The author plans to use the model with the Coplay Unity plugin for game development.
  • Hardware is modest (RTX 4060 Ti with 16 GB VRAM, 32 GB RAM, i9-9900) and may constrain model choices.
  • They are asking the Reddit community for recommendations on budget local LLMs.

I'm looking for a model I can run for use with the Coplay Unity plugin to work on some game projects.

I have an RTX 4060 Ti with 16 GB of VRAM, 32 GB of DDR4 RAM, and an i9-9900 CPU. Nowhere near industry-level resources, but hopefully enough for something useful.
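As a rough sanity check on what fits in that hardware, a common back-of-the-envelope estimate for a quantized model's footprint is parameter count times bits per weight, plus some fixed overhead for context/KV cache. The numbers below are assumptions (roughly 4.5 effective bits per weight for a Q4-style quant, ~1.5 GB overhead), not measurements:

```python
def approx_vram_gb(params_b: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
    """Rough footprint estimate for a quantized model.

    params_b: parameter count in billions
    bits_per_weight: effective bits per weight (Q4-style quants land around 4.5)
    overhead_gb: assumed fixed overhead for context/KV cache (a guess, varies with context length)
    """
    return params_b * bits_per_weight / 8 + overhead_gb

# Assumed sizes for illustration; actual GGUF files vary by quant flavor.
for params in (7, 14, 32):
    est = approx_vram_gb(params, 4.5)
    fits = "fits" if est <= 16 else "exceeds"
    print(f"{params}B @ ~Q4: ~{est:.1f} GB ({fits} 16 GB VRAM)")
```

By this estimate, 7B and 14B models at Q4 fit comfortably in 16 GB of VRAM, while a 32B model would need to spill into system RAM and run much slower.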

Any suggestions would be greatly appreciated.

submitted by /u/SirStarshine