Ollama Now Officially Supports the Anthropic API!
Ollama now supports the Anthropic API, letting you run local models such as qwen3-coder with Claude Code. It is compatible with all Anthropic tools and SDKs.
Ollama now officially supports Anthropic’s API interface.
This means we can now connect locally running models to Claude Code.
To use it, you need Ollama v0.14.0 or later. After downloading the update, simply restart Ollama, and it will serve the Anthropic Messages API.
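As a minimal sketch of how the Claude Code connection might be configured: Claude Code reads its API endpoint and credentials from environment variables, so pointing it at a local server could look like the following. The port 11434 (Ollama's default), the placeholder token, and the model name qwen3-coder are assumptions for illustration.

```shell
# Assumption: Ollama is running locally on its default port 11434.
export ANTHROPIC_BASE_URL="http://localhost:11434"
# Placeholder token; a local Ollama server does not validate API keys.
export ANTHROPIC_AUTH_TOKEN="ollama"
# Assumed example model; use any model you have pulled locally.
export ANTHROPIC_MODEL="qwen3-coder"

claude
```

With these variables set, Claude Code sends its requests to the local Ollama server instead of Anthropic's hosted API.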
And this isn't limited to Claude Code: any tool, interface, or SDK built on Anthropic's API can now be pointed at Ollama.
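To illustrate what "compatible with the Anthropic Messages API" means in practice, here is a minimal stdlib-only sketch that POSTs a request to a local server. It assumes Ollama is listening on its default port 11434, exposes the Messages endpoint at the standard path `/v1/messages`, and has a `qwen3-coder` model pulled; all three are assumptions, not confirmed by the announcement.

```python
import json
import urllib.request

def build_request(model: str, prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Build a POST request in Anthropic Messages API format.

    Assumption: a local Ollama server exposes the endpoint at
    http://localhost:11434/v1/messages (11434 is Ollama's default port).
    """
    body = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "http://localhost:11434/v1/messages",
        data=json.dumps(body).encode(),
        headers={"content-type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled.
    with urllib.request.urlopen(build_request("qwen3-coder", "Say hello in one sentence.")) as resp:
        reply = json.load(resp)
    # The Messages API returns content as a list of blocks.
    print(reply["content"][0]["text"])
```

Because the request format matches Anthropic's Messages API, the same payload works with the official SDKs by overriding the client's base URL to the local server.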