The best open-source LLMs in 2026
Open-weight models have caught up faster than most predicted. Today you can self-host a model that matches frontier coding agents on Terminal-Bench, and run it on a single 8×H100 box. Here's what to grab.
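Whether a model fits on a single 8×H100 box comes down mostly to weights-only VRAM. A rough back-of-envelope sketch (illustrative, and an assumption of ours, not a benchmark from this list; it ignores KV cache, activations, and runtime overhead):

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough VRAM needed just to hold the weights, in GB.

    params_billion * 1e9 params * bytes_per_param bytes / 1e9 bytes-per-GB
    simplifies to params_billion * bytes_per_param.
    """
    return params_billion * bytes_per_param

H100_NODE_GB = 8 * 80  # one 8xH100 node: 640 GB of HBM total

# Illustrative total-parameter counts and precisions, not measured figures.
for name, total_b in [("284B MoE", 284), ("671B MoE", 671)]:
    for fmt, bytes_pp in [("FP8", 1.0), ("INT4", 0.5)]:
        need = weight_memory_gb(total_b, bytes_pp)
        verdict = "fits" if need < H100_NODE_GB else "weights alone exceed"
        print(f"{name} @ {fmt}: ~{need:.0f} GB -> {verdict} {H100_NODE_GB} GB")
```

Note that for MoE models the *total* parameter count drives memory, while the *active* count drives per-token compute; a 671B model at FP8 already exceeds the node's 640 GB on weights alone, which is why aggressive quantization or multi-node serving comes up for the largest entries below.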
DeepSeek-V4-Pro
DeepSeek — 1.6T (49B active) — MIT. DeepSeek's flagship open-weight MoE.
DeepSeek-V3.1
DeepSeek — 671B — MIT. Large MoE open-weight model.
DeepSeek-R1
DeepSeek — 671B — MIT. Reasoning-focused open-weight model.
Llama 4 Maverick
Meta — 400B — Llama 4 Community License. Mixture-of-experts open-weight model.
DeepSeek-V4-Flash
DeepSeek — 284B (13B active) — MIT. Smaller, faster sibling to DeepSeek-V4-Pro.
Qwen3 235B
Alibaba — 235B — Apache-2.0. Predecessor to the Qwen3.6 family.
Mistral Large 2
Mistral — 123B — Mistral Research License. Self-hostable.
Llama 4 Scout
Meta — 109B — Llama 4 Community License. Self-hostable.
Want the rest? Browse the full model catalog, or build a side-by-side comparison.