I'll admit I have long, somewhat snobbishly, felt that local models were not 'up to my standards' for local development, or otherwise able to compete with GHCP, Claude Code, Cursor, etc.
Boy was I wrong. With the tightening usage constraints and general enshittification of plans that all the cloud providers have started to enact, I finally downloaded Opencode and got it set up with llama-server + Qwen3.6-27B at a reasonable quant (Q5_K_P) with 128K context (unsure if I could push this further, but it's plenty for the time being). I'm currently serving it with a single 5090 in a dedicated Linux box downstairs. It is immensely freeing not to have to think about usage limits, or about my code and prompts being analyzed by some arbitrary review process that decides whether I get to keep my account, and so on.
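For anyone curious about the serving side, a minimal llama-server invocation looks roughly like this. The model path and port are placeholders I made up, so adjust them for your setup (check `llama-server --help` for the flags your build supports):

```shell
# Serve a quantized GGUF model over llama.cpp's OpenAI-compatible HTTP API.
# -c 131072 sets the 128K context window, -ngl 99 offloads all layers to
# the GPU, and --host 0.0.0.0 exposes it on the LAN so a client like
# Opencode running on another machine can reach it.
llama-server \
  -m /models/qwen-q5.gguf \
  -c 131072 \
  -ngl 99 \
  --host 0.0.0.0 \
  --port 8080
```

Opencode (or any OpenAI-compatible client) can then be pointed at `http://<box-ip>:8080/v1`.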
Is it perfect? No, I've had to halt it once in a while when it gets stuck in a loop. But overall... this feels like the future to me. Honestly, it still feels a bit crazy that I'm chatting with a piece of metal in my house, but here we are.
Anyway, I suppose this is probably not a huge surprise for this particular subreddit. But then again, I have frequented it a lot and was skeptical... so I just wanted to share: if you've been on the fence about trying it, I think it's at the point now where it's very worthwhile indeed, especially if you want to dev some things that cloud providers might take account action against (security research, scraping, etc.).

