AI Disruption

Mistral AI Launches Mistral 3 Line, Back to Apache 2.0

Mistral 3 open models land: Apache 2.0, 675B MoE & 3-14B edge variants ready to deploy

Meng Li
Dec 03, 2025



Mistral's announcement: “Introducing the Mistral 3 family of models: frontier intelligence at all sizes. Apache 2.0.”

Mistral AI has just released its new generation of open models: the Mistral 3 series.

The series includes multiple models, specifically:

• “The world’s best small models”: Ministral 3 (14B, 8B, 3B), with each model released in base, instruction-tuned, and reasoning versions.

• A frontier-level open-source MoE: Mistral Large 3, with 675B total parameters and 41B active parameters.
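The gap between the two parameter counts for Mistral Large 3 comes from its mixture-of-experts (MoE) design: only a subset of expert weights is used for each token, so per-token compute scales with the active count, not the total. A minimal sketch of that ratio, using only the figures from the announcement above:

```python
# Mixture-of-experts models route each token through a subset of experts,
# so only a fraction of the weights participate in any forward pass.
# Figures below are from Mistral's Large 3 announcement.
total_params = 675e9   # total parameters across all experts
active_params = 41e9   # parameters active per token

active_fraction = active_params / total_params
print(f"Active per token: {active_fraction:.1%}")
```

On these numbers, roughly 6% of the weights are exercised per token, which is why MoE models can offer frontier-scale capacity at a much lower inference cost than an equally sized dense model.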

Mistral stated: “All models are released under the Apache 2.0 license. Open-sourcing our models in multiple compressed formats empowers the developer community and puts AI in people’s hands through distributed intelligence.”

The company also claims: “Ministral models represent the best cost-performance ratio in their class. Meanwhile, Mistral Large 3 has joined the ranks of frontier instruction-tuned open-source models.”

The series drew wide attention on release, with some observers saying it marks Europe's return to an AI race dominated by China and the US.

However, Mistral's benchmark presentation has also drawn questions from some developers, who point out that the models' multimodal performance is weak.
