AI Disruption

Ollama Now Officially Supports the Anthropic API!

Ollama now supports the Anthropic API, letting you run local models such as qwen3-coder with Claude Code. It's compatible with all Anthropic tools and SDKs.

Meng Li
Feb 11, 2026



Video: Run Claude Code Locally with Ollama (no subscription or API key required)

Ollama now officially supports Anthropic’s API interface.

This means we can now connect locally running models to Claude Code.

Currently, you need to update to Ollama v0.14.0 or higher. After downloading the update, simply restart Ollama and it will be compatible with the Anthropic Messages API.
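As a rough sketch of the local setup: the commands below assume Ollama's default port (11434) and that qwen3-coder is available in the Ollama model library; the environment variable names are the ones Claude Code is commonly configured with and may vary by version.

```shell
# Verify the Ollama version and fetch a local coding model
# (assumes Ollama is installed and its server is running).
ollama --version          # should report 0.14.0 or newer
ollama pull qwen3-coder   # download a local coding model

# Point Claude Code at the local server instead of api.anthropic.com.
export ANTHROPIC_BASE_URL=http://localhost:11434
export ANTHROPIC_MODEL=qwen3-coder
claude
```

This is a configuration sketch, not a verified recipe; check your Ollama and Claude Code versions for the exact variable names they expect.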

And it's not just limited to Claude Code: any tool, interface, or SDK compatible with the Anthropic API can now be used.
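To make "compatible with the Anthropic Messages API" concrete, here is a minimal sketch of the request body that API expects, built with only the standard library. The field names follow Anthropic's public Messages API; the assumption that a local Ollama v0.14+ server accepts this same shape unchanged is mine, not stated in the post.

```python
import json

def build_messages_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build a request body in the Anthropic Messages API shape.

    A local Ollama v0.14+ server is assumed to accept this same body
    (hypothetical example; endpoint path and auth not shown here).
    """
    return {
        "model": model,            # e.g. a local Ollama model tag like "qwen3-coder"
        "max_tokens": max_tokens,  # required field in the Messages API
        "messages": [
            {"role": "user", "content": prompt},
        ],
    }

body = build_messages_request("qwen3-coder", "Write a hello-world in Go.")
print(json.dumps(body, indent=2))
```

Because the wire format is the same, any Anthropic SDK that lets you override its base URL can be pointed at the local server with a placeholder API key.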
