Mistral Large 3
Mistral's flagship MoE model
A European frontier model: a 675B-parameter MoE with 41B active parameters. Strong coding performance (92% HumanEval).
Provider mistral
Type llm
Access open_weight
Params 675B MoE (41B active)
Context 128k
License apache-2.0
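The parameter line above implies that, as a sparse MoE, only a small fraction of the network runs on any one forward pass. A minimal sketch of that arithmetic, using only the figures from the spec line (the per-token expert count is not stated here):

```python
# Active-parameter fraction for a sparse MoE, using the spec-line figures.
total_params = 675e9   # 675B total parameters
active_params = 41e9   # 41B active per token

# Fraction of the network exercised on a single forward pass.
active_fraction = active_params / total_params
print(f"{active_fraction:.1%} of weights active per token")  # → 6.1%
```

This is why per-token inference cost tracks the 41B active figure rather than the 675B total, while memory footprint still scales with the full parameter count.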
Why It Matters
A European-developed alternative to US and Chinese frontier models, released under the Apache 2.0 license. Strong on code (92% HumanEval).
Known Limitations
Availability varies across providers. Less thoroughly tested on agentic workflows than Claude or GPT models.
Released 2025-10-01
Training cutoff 2025-08
Created March 22, 2026