Llama 4 Maverick

Status published

Meta's open-weight MoE model

Open-weight mixture-of-experts model with 400B total parameters, positioned as competitive with proprietary frontier models.

Provider meta
Type llm
Access open_weight
Params 400B (MoE)
Context 128k
License llama-community

Why It Matters

Among the largest open-weight models at its release. The MoE architecture keeps per-token inference cost reasonable despite the total parameter count, since only a fraction of the weights are active for any given token.
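The cost argument can be made concrete with a small sketch. The active-parameter figure below is an assumption, not from this card: Llama 4 Maverick has been reported to activate roughly 17B of its 400B parameters per token.

```python
# Illustration of why MoE inference is cheaper than the total
# parameter count suggests. Figures are assumptions for the sketch:
# ~17B active parameters per token out of 400B total.
TOTAL_PARAMS = 400e9
ACTIVE_PARAMS = 17e9  # assumed: shared layers + routed experts per token

def active_fraction(active: float, total: float) -> float:
    """Fraction of weights participating in each forward pass."""
    return active / total

frac = active_fraction(ACTIVE_PARAMS, TOTAL_PARAMS)
print(f"{frac:.1%} of weights active per token")  # prints "4.2% of weights active per token"
```

Under these assumed figures, per-token compute scales with the ~4% of weights that are active, not the full 400B, which is why the model can be served at a cost closer to a mid-sized dense model.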

Known Limitations

The Llama Community License restricts commercial use: deployments exceeding 700M monthly active users (MAU) require a separate license from Meta. The MoE architecture also adds deployment complexity compared with dense models.
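The license gate amounts to a simple threshold check. A minimal sketch, assuming only the 700M MAU figure stated above (the function name is illustrative, not from any Meta tooling):

```python
# Sketch of the Llama Community License commercial-use gate.
# Assumption: the only input is monthly active users; the 700M
# threshold comes from the license text summarized in this card.
MAU_THRESHOLD = 700_000_000

def needs_separate_license(monthly_active_users: int) -> bool:
    """True if usage exceeds the Llama Community License MAU cap."""
    return monthly_active_users > MAU_THRESHOLD

print(needs_separate_license(800_000_000))  # prints "True"
print(needs_separate_license(50_000_000))   # prints "False"
```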

Released 2025-07-01
Training cutoff 2025-06
Created March 22, 2026
Last reconciled Never