Move to local models

Reddit r/LocalLLaMA / 4/17/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage

Key Points

  • The author shares that they use the Claude web UI for their projects and recently built a local machine with dual 16 GB Tesla V100 GPUs for tinkering with local models.
  • They are using Open WebUI, but find that it does not provide downloadable files as easily as the Claude web interface does.
  • They ask whether they are missing a feature in Open WebUI, or if there is a way to add file-download functionality.
  • They also request alternative ways to work with Open WebUI to better match Claude’s file-sharing/workflow experience.
  • The post is framed as a practical workflow question from a local LLM user community rather than a formal product announcement.

Hi all, I'm a big user of the Claude web UI for my projects. I just built a local host with dual Tesla V100 16GB GPUs and I'm doing some tinkering with it using Open WebUI. It's nice, but it doesn't provide files in an easily downloadable way like the Claude web UI does. Am I missing something? Is there a way to add that functionality, or is there a better way to work with Open WebUI?

submitted by /u/Totalkiller4
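
For anyone hitting the same wall, one workaround is to skip the UI for file-producing tasks and call the backend's OpenAI-compatible API directly, writing the reply to disk yourself. This is a minimal sketch, not a built-in Open WebUI feature: the endpoint URL (Ollama's default is assumed here), the model tag, and the output filename are all placeholders to adapt to your own setup.

```python
# Minimal sketch: ask a local model for some content and save the
# reply as a real file on disk, sidestepping the chat window.
# Assumptions: an OpenAI-compatible server (e.g. Ollama) is listening
# on localhost:11434, and a model tagged "llama3" has been pulled.
from pathlib import Path

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local endpoint
    api_key="unused",                      # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama3",  # hypothetical model tag; use whatever you run
    messages=[{"role": "user", "content": "Write a short README for my project."}],
)

# The reply lands on disk as an ordinary file instead of text
# trapped in a browser pane.
out = Path("reply.md")
out.write_text(response.choices[0].message.content or "")
print(f"Saved {out.resolve()}")
```

The same endpoint works from any script or notebook, so file handling stays in your hands rather than depending on whatever download affordances the web UI happens to expose.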