gpt.buzz

Curated model list

The best open-source LLMs in 2026

Open-weight models have caught up faster than most predicted. Today you can self-host a model that matches frontier coding agents on Terminal-Bench, and run it on a single 8×H100 box. Here's what to grab.
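Whether a model actually fits on one node comes down to simple arithmetic: weight memory is parameter count times bytes per parameter, versus the node's total HBM. A minimal sketch, assuming FP8 weights (1 byte/param) and 80 GiB H100s; real deployments also need headroom for KV cache and activations, so treat this as a lower bound:

```python
# Back-of-the-envelope weight-memory estimate for self-hosting.
# Assumptions: FP8 = 1 byte/param, H100 = 80 GiB HBM; KV cache and
# activation memory are NOT included, so real requirements are higher.

def weight_gib(n_params: float, bytes_per_param: float) -> float:
    """GiB needed just to hold the weights at a given precision."""
    return n_params * bytes_per_param / 2**30

NODE_HBM_GIB = 8 * 80  # a single 8xH100 box

for name, params in [("DeepSeek-V4-Flash (284B)", 284e9),
                     ("Mistral Large 2 (123B)", 123e9)]:
    need = weight_gib(params, bytes_per_param=1)  # FP8
    verdict = "fits" if need < NODE_HBM_GIB else "does not fit"
    print(f"{name}: ~{need:.0f} GiB at FP8 ({verdict} in {NODE_HBM_GIB} GiB)")
```

By the same math, the very largest entries below need multi-node setups or aggressive quantization before they squeeze into a single box.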

01

DeepSeek-V4-Pro

DeepSeek

1.6T (49B active) — MIT. DeepSeek's flagship open-weight MoE.

02

DeepSeek-V3.1

DeepSeek

671B (37B active) — MIT. Large MoE open-weight model.

03

DeepSeek-R1

DeepSeek

671B (37B active) — MIT. Reasoning-focused open-weight model.

04

Llama 4 Maverick

Meta

400B (17B active) — Llama 4 Community License. Mixture-of-experts open-weight model from Meta.

05

DeepSeek-V4-Flash

DeepSeek

284B (13B active) — MIT. Smaller, faster sibling to DeepSeek-V4-Pro.

06

Qwen3 235B

Alibaba

235B (22B active) — Apache-2.0. Predecessor to the Qwen3.6 family.

07

Mistral Large 2

Mistral

123B — Mistral Research License (non-commercial). Dense model, self-hostable.

08

Llama 4 Scout

Meta

109B (17B active) — Llama 4 Community License. The smaller Llama 4 MoE, self-hostable on a single node.

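Several entries above quote two numbers, e.g. "1.6T (49B active)". In a mixture-of-experts model, each token is routed through only a few experts, so per-token compute tracks the *active* parameter count while memory tracks the *total*. A quick sketch using the figures from the list (the model sizes are taken as given from the entries above):

```python
# MoE economics in one line: memory scales with total params,
# per-token FLOPs scale with active params.
# Parameter figures are taken from the list entries above.

def active_fraction(total: float, active: float) -> float:
    """Fraction of the model's weights used per token."""
    return active / total

models = {
    "DeepSeek-V4-Pro":   (1.6e12, 49e9),
    "DeepSeek-V4-Flash": (284e9, 13e9),
}

for name, (total, active) in models.items():
    print(f"{name}: {active_fraction(total, active):.1%} of weights active per token")
```

This is why a 1.6T-parameter MoE can be far cheaper to run per token than a dense model a tenth its size, even though it is much more expensive to store.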

Want the rest? Browse the full model catalog, or build a side-by-side comparison.
