MiniMax AI Launches MiniMax-M1: An Open-Source 456B Parameter Language Model with 1 Million Token Context

June 17, 2025 · Artificial Intelligence

On June 17, Chinese AI company MiniMax AI unveiled MiniMax-M1, which it describes as the world's first open-weight, large-scale hybrid-attention language model. With 456 billion parameters and a 1 million token context window, MiniMax-M1 sets a new standard for handling extremely long and complex inputs. The model combines a Mixture-of-Experts (MoE) architecture with a "lightning attention" mechanism and a reinforcement learning algorithm called CISPO, allowing it to generate 100,000-token outputs with roughly 25% of the compute required by comparable earlier models.
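The compute savings stem from the attention mechanism's scaling behavior: lightning attention is a linear-complexity variant, so its per-layer cost grows with sequence length n rather than n². A rough back-of-the-envelope sketch (the head dimension and constant factors here are illustrative, not MiniMax-M1's actual configuration, and the real model interleaves linear and softmax attention rather than using one exclusively):

```python
def softmax_attention_cost(n: int, d: int) -> float:
    # Standard attention: computing the n x n score matrix QK^T plus the
    # weighted sum over values costs roughly 2 * n^2 * d multiply-adds per head.
    return 2.0 * n * n * d

def linear_attention_cost(n: int, d: int) -> float:
    # Kernelized (linear) attention instead maintains a d x d running state,
    # costing roughly 2 * n * d^2 multiply-adds per head.
    return 2.0 * n * d * d

# 100k-token sequence, per-head dimension of 128 (illustrative values only)
n, d = 100_000, 128
ratio = linear_attention_cost(n, d) / softmax_attention_cost(n, d)
print(f"linear / softmax cost at n={n}: {ratio:.4f}")  # ratio simplifies to d / n
```

At this length the linear layers cost a small fraction of their quadratic counterparts, which is why a hybrid stack dominated by linear-attention blocks can cut end-to-end generation compute so sharply for long outputs.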


Designed to boost productivity in complex real-world scenarios, MiniMax-M1 is fully open-source under the Apache 2.0 license, allowing businesses and developers to freely use and customize it, including for commercial applications. The model comes in two "thinking budget" variants, 40k and 80k tokens, and excels at mathematics, software engineering, and extended-context reasoning tasks. Benchmark results show MiniMax-M1 consistently outperforming other open-weight models such as DeepSeek R1 and Qwen3-235B on coding, long-context, and tool-use challenges.


The launch coincided with other notable releases: MiniMax AI also announced a new video AI model, Hailuo 02 (0616), positioned as a competitor to ByteDance's Seedance, while Moonshot AI separately released Kimi-Dev-72B, a coding model reported to surpass DeepSeek R1. For MiniMax-M1, the company offers secure API services, a chatbot platform, and comprehensive technical documentation for rapid developer integration, with model weights and code available on HuggingFace and GitHub.


As a leader in Asia’s AI scene, MiniMax aims to usher in a new era of advanced AI development where intelligence benefits everyone, placing MiniMax-M1 at the forefront of industry innovation.
