v0.18.3

Ollama Releases / 3/26/2026


Key Points

  • Ollama v0.18.3 adds the ability to launch models directly in VS Code via the new `ollama launch vscode` command.
  • The release includes documentation updates for Claude Code/Openclaw covering web search and non-interactive operation.
  • The `launch` command now skips redundant configuration writes when the selected model hasn’t changed.
  • Performance improves in the `mlxrunner` component by sharing the KV cache across conversations that begin with a common prefix.
  • The desktop app no longer gets stuck loading when `OLLAMA_HOST` is set to an unspecified bind address.
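The KV-cache sharing mentioned above can be illustrated with a small sketch. This is not `mlxrunner`'s actual implementation; `PrefixKVCache` and its methods are hypothetical names, and strings stand in for real attention state. The idea is that two conversations starting with the same system prompt only pay for the tokens where they diverge:

```python
# Illustrative sketch of prefix-based KV-cache reuse (hypothetical
# helper, NOT mlxrunner's real code): cache KV entries for every token
# prefix so a later conversation can resume at the longest shared one.

class PrefixKVCache:
    def __init__(self):
        # Maps a token-prefix tuple to its (mock) KV-cache entries.
        self._cache = {}

    def longest_prefix(self, tokens):
        """Return the longest cached prefix of `tokens` (may be empty)."""
        for end in range(len(tokens), 0, -1):
            prefix = tuple(tokens[:end])
            if prefix in self._cache:
                return prefix
        return ()

    def compute(self, tokens):
        """Return KV entries for `tokens`, reusing any shared prefix."""
        prefix = self.longest_prefix(tokens)
        kv = list(self._cache.get(prefix, []))
        # Only the suffix beyond the shared prefix needs fresh compute.
        for i in range(len(prefix), len(tokens)):
            kv.append(f"kv({tokens[i]})")  # stand-in for attention state
            # Cache every intermediate prefix so later conversations
            # can branch off at any shared point.
            self._cache[tuple(tokens[:i + 1])] = list(kv)
        return kv, len(prefix)

cache = PrefixKVCache()
system = ["You", "are", "helpful"]
_, reused = cache.compute(system + ["Hi"])
print(reused)   # 0: nothing cached yet
_, reused = cache.compute(system + ["Bye"])
print(reused)   # 3: the shared system-prompt prefix is reused
```

In a real runner the cached values are per-layer key/value tensors rather than strings, but the lookup structure is the same: the cost of a new request is proportional to the tokens that are not covered by a cached prefix.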

Ollama launch VS Code

Ollama can now directly launch models in VS Code.

ollama launch vscode # or code

What's Changed

Full Changelog: v0.18.2...v0.18.3