Experts-Volunteers needed for Vulkan on ik_llama.cpp

Reddit r/LocalLLaMA / 4/26/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends

Key Points

  • The maintainer of ik_llama.cpp says they no longer have the bandwidth to continue maintaining the Vulkan back-end, even though it was brought up to speed previously.
  • They are asking for expert volunteers to help resurrect and maintain Vulkan support, noting that llama.cpp has dedicated maintainers for Vulkan.
  • A key technical goal is to implement graph-parallel features in the Vulkan back-end after porting missing operations accumulated since the maintainer’s last effort.
  • The post cautions that Vulkan work is difficult for beginners: since the maintainer cannot review Vulkan code themselves, contributions prepared with AI assistance would go unchecked, which could degrade the back-end over time.
  • Community links point to an existing pull request and discussion threads where help and contributions are being solicited.

ik_llama.cpp is great for both CPU & CUDA. Need legends to make Vulkan better as well.

https://github.com/ikawrakow/ik_llama.cpp/discussions/590#discussioncomment-16357564

So, after bringing the Vulkan back-end up to speed some time ago, I felt that I simply don't have the bandwidth to also maintain it. In llama.cpp there are two maintainers who do nothing else but Vulkan.
But if you are willing to do that, we can try to resurrect Vulkan. Of particular interest would be to implement the graph parallel stuff in the Vulkan back-end (after porting quite a few missing ops that have accumulated since my last effort).
I guess the issue will be that I'm a complete beginner when it comes to Vulkan. So, unlike your CPU changes prepared with the help of Claude, where I was able to quickly spot a problem, with Vulkan we will be left at Claude's mercy, which may turn into a complete disaster with time. So, I think, if you want to become a Vulkan maintainer for ik_llama.cpp, you need to become significantly more knowledgeable than me.

https://github.com/ikawrakow/ik_llama.cpp/pull/608

https://github.com/ikawrakow/ik_llama.cpp/discussions/562

Thanks in advance!

submitted by /u/pmttyji