I have set up Mistral-Nemo with Ollama, Docker, OpenWebUI, and Tavily, but I'm having an issue: when I send a new message, the model has no previous context and answers as if it were a new chat.
Reddit r/LocalLLaMA / 3/26/2026