Be it opencode, the VS Code Copilot extension, or whatever "open source" AI tool, I rarely see llama.cpp treated as a first-class provider. Every single one of them has Ollama and sometimes LM Studio. Engineering-wise there's literally zero extra effort to list llama.cpp the same way as Ollama. Or better yet, simply make it a label-agnostic OpenAI-API-compatible endpoint and let me fill in the port number/endpoint. This is especially annoying because Ollama, the scummy turncoat built on top of llama.cpp, still has the mindshare despite it being clear as day that they are not good members of the OSS ecosystem. llama.cpp is now very usable for the average dev (the majority of the current user base) and reasonably so for the average joe.
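To illustrate how little work the "label-agnostic endpoint" option actually is: llama.cpp's `llama-server` speaks the OpenAI chat-completions API, so the only provider-specific detail a tool needs from the user is a base URL. A minimal sketch (port 8080 is `llama-server`'s default; the model name is mostly cosmetic since the server runs whatever model it was launched with):

```python
import json

# Base URL is the ONE thing that differs between llama.cpp, LM Studio,
# or any other OpenAI-compatible local server. 8080 is llama-server's default port.
BASE_URL = "http://localhost:8080/v1"

def chat_request(prompt: str, model: str = "local") -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-compatible chat completion."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,  # llama-server serves whichever model it was started with
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request("Hello")
print(url)  # http://localhost:8080/v1/chat/completions
```

Any HTTP client (or the official `openai` SDK with `base_url=BASE_URL`) can send this request unchanged; that's the whole integration.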
I'm high-key hoping this post reaches the devs who are making these tools.