AI Disruption

MiniMax Open-Sources M2, Free Limited-Time

10B-active MiniMax-M2 tops open-source charts—code, agents, free API & weights

Meng Li
Oct 27, 2025


MiniMax M2 is here! I tried out its capabilities on my own benchmarks!

10 Billion Active Parameters!

MiniMax has open-sourced its new model, M2, which ranks first globally among open-source models in comprehensive performance.

Just now, MiniMax released and open-sourced MiniMax-M2, a lightweight model built for coding and agent workflow construction.


MiniMax-M2 focuses on improving agent efficiency. It is a Mixture of Experts (MoE) model with 230 billion total parameters, of which 10 billion are active, balancing programming and agent tasks with general intelligence.
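The MoE idea behind those numbers is that a gating network routes each token to only a few experts, so the compute cost tracks the active parameters (10B) rather than the total (230B). The toy sketch below illustrates top-k expert routing in general; it is not MiniMax's actual architecture, and the expert count, dimensions, and linear "experts" are illustrative assumptions.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token vector through the top-k experts of a toy MoE layer."""
    logits = gate_w @ x                      # one routing score per expert
    topk = np.argsort(logits)[-k:]           # indices of the k highest-scoring experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                 # softmax over the chosen experts only
    # Only the selected experts run, so compute scales with k,
    # not with the total number of experts.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, n_experts = 8, 16                         # illustrative sizes, not MiniMax's
x = rng.normal(size=d)                       # a single token embedding
gate_w = rng.normal(size=(n_experts, d))     # gating network weights
# Each "expert" here is a plain linear map; real experts are full FFN blocks.
expert_mats = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda v, m=m: m @ v for m in expert_mats]

y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)  # → (8,)
```

With 16 experts and k=2, only 2/16 of the expert parameters touch each token, which is the same sparsity principle that lets a 230B-parameter model run with roughly 10B active parameters per forward pass.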

In authoritative benchmark evaluations, MiniMax-M2 surpasses leading models such as Gemini 2.5 Pro and DeepSeek-V3.2 and approaches GPT-5 (thinking). MiniMax claims it delivers comparable end-to-end tool-use performance while being easier to deploy and scale.

This post is for paid subscribers
