AI Disruption

Mistral AI Launches Mistral Small 3: 24B Rivals 70B!

Mistral Small 3 revolutionizes AI with 24B parameters, rivaling 70B giants in speed and performance. Open-source, efficient, and ready for local deployment!

Meng Li
Feb 03, 2025
∙ Paid

"AI Disruption" publication New Year 30% discount link.


DeepSeek Rises Up: The Open-Source Large Model Battle Rekindles!

A new 24B model called Mistral Small 3 has appeared on Hugging Face’s popular leaderboard.

Mistral Small 3 challenges the 70B-parameter giants with its “small frame” of 24 billion parameters, matching the performance of Llama 3.3 70B while running over 3 times faster.

Even more impressive, it is open-sourced under the Apache 2.0 license and can run locally on a single RTX 4090 card.
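As a back-of-envelope check on the "single RTX 4090" claim, the sketch below estimates the VRAM needed just to hold 24 billion weights at different precisions. The figures ignore activation and KV-cache overhead, so treat them as rough lower bounds; they do show why local deployment on a 24 GB card depends on quantization.

```python
# Rough memory-footprint estimate for a 24B-parameter model.
# Back-of-envelope only: covers weights, not activations or KV cache.

def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Gigabytes needed just to store the model weights."""
    return n_params * bits_per_param / 8 / 1e9

N_PARAMS = 24e9          # Mistral Small 3 parameter count
RTX_4090_VRAM_GB = 24    # VRAM on a single RTX 4090

for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gb = weight_memory_gb(N_PARAMS, bits)
    verdict = "fits" if gb < RTX_4090_VRAM_GB else "does not fit"
    print(f"{label}: {gb:.0f} GB of weights -> {verdict} in {RTX_4090_VRAM_GB} GB VRAM")
```

At fp16 the weights alone take about 48 GB, so a 4-bit (or similar) quantized build is what makes single-GPU local inference practical.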

In this article, we look at how Mistral's "small move" is reshaping the AI landscape through efficiency, and at the technological breakthroughs and application potential behind it.

Mistral Small 3 was trained without reinforcement learning (RL) or synthetic data, which places it earlier in the model production pipeline than models like DeepSeek R1 (an excellent and complementary open-source technology).

This post is for paid subscribers
