MiniMax Open-Sources M2, Free Limited-Time
10B-active MiniMax-M2 tops open-source charts—code, agents, free API & weights
10 Billion Active Parameters!
MiniMax Open Sources New Model M2, Ranking First Globally Among Open Source Models in Comprehensive Performance.
Just now, MiniMax released and open-sourced MiniMax-M2, a lightweight model built for coding and agent workflow construction.
MiniMax-M2 is a Mixture of Experts (MoE) model with 230 billion total parameters, of which 10 billion are active per token. It focuses on agent efficiency, balancing coding and agent tasks with general intelligence.
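As a rough illustration of what "active parameters" means for an MoE model, the sketch below uses only the two totals reported in the article (230B total, 10B active); any per-expert breakdown would be hypothetical, so none is assumed.

```python
# Illustrative MoE parameter arithmetic.
# Figures come from the article: 230B total parameters, 10B active per token.
TOTAL_PARAMS = 230e9    # all expert + shared weights stored in the model
ACTIVE_PARAMS = 10e9    # weights actually used for any single token

# In an MoE, a router selects a small subset of experts per token,
# so compute cost tracks ACTIVE_PARAMS, not TOTAL_PARAMS.
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active per token: {active_fraction:.1%} of total weights")
```

Running this shows that only about 4.3% of the model's weights participate in any single forward pass, which is why an MoE of this size can be cheaper to serve than a dense model of comparable total parameter count.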
On standard benchmarks, MiniMax-M2's results surpass leading models such as Gemini 2.5 Pro and DeepSeek-V3.2 and approach GPT-5 (thinking). MiniMax claims the model delivers comparable end-to-end tool-use performance while being easier to deploy and scale.