AI Disruption

Alibaba's 480B AI coding model beats Kimi K2
Alibaba's Qwen3-Coder-480B: an open-source SOTA AI programming model (480B parameters) that beats Kimi K2 and GPT-4.1. Free on Qwen Chat and Hugging Face. 256K–1M context, 65K output.

Meng Li
Jul 23, 2025



The strongest open-source programming model has changed hands.

Just now, Alibaba's Qwen team open-sourced its latest flagship programming model Qwen3-Coder-480B-A35B-Instruct.

The Qwen team claims this is their most powerful open-source agentic programming model to date: 480B total parameters with 35B activated, native support for a 256K-token context that can be extended to 1 million input tokens via extrapolation, and a maximum output of 65,000 tokens.
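Since the model is released openly and is typically served behind OpenAI-compatible chat endpoints, a request to it might be structured as below. This is a minimal sketch under assumptions: the model ID follows the Hugging Face naming in this post, and the 65,000-token cap reflects the maximum output mentioned above; the actual endpoint and serving details depend on your provider.

```python
def build_chat_request(prompt: str,
                       model: str = "Qwen/Qwen3-Coder-480B-A35B-Instruct",
                       max_tokens: int = 65_000) -> dict:
    """Build an OpenAI-style chat-completions payload for Qwen3-Coder.

    max_tokens defaults to the 65K output ceiling described in the post;
    the model ID is the Hugging Face repo name and is an assumption here.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }

# Example: ask the model for a small coding task.
payload = build_chat_request("Write a Python function that reverses a string.")
print(payload["model"])
```

The payload would then be POSTed to whichever chat-completions endpoint hosts the model (Qwen Chat's API, a self-hosted server, or a cloud provider).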

In benchmark tests, Qwen3-Coder delivers excellent results on programming and agentic tasks, achieving open-source SOTA in three categories: Agentic Coding, Agentic Browser-Use, and Agentic Tool-Use. It surpasses open-source models such as Kimi K2 and DeepSeek V3 as well as closed-source models like GPT-4.1, and is competitive with Claude Sonnet 4, a model renowned for its programming capabilities.

Qwen3-Coder will be available in multiple sizes, and this release is its most powerful variant. Its 480B parameter count exceeds the 235B of Alibaba's flagship model Qwen3, but is smaller than Kimi K2's 1T (1 trillion).

According to Alibaba's official introduction, with Qwen3-Coder a novice programmer can complete in one day what would take a senior programmer a week, and generating a brand website can take as little as five minutes.


This post is for paid subscribers.