gpt.buzz

Compare models

Pick up to 4 models. Specs render side-by-side. Share the URL; the page is stateless.


|                | Claude 4.7 Opus   | o3             | DeepSeek-V4-Flash | DeepSeek-V4-Pro   |
| -------------- | ----------------- | -------------- | ----------------- | ----------------- |
| Vendor         | Anthropic         | OpenAI         | DeepSeek          | DeepSeek          |
| Family         | Claude            | o-series       | DeepSeek          | DeepSeek          |
| Release date   | 2026-04-16        | 2025-04-16     | 2026-04-22        | 2026-04-22        |
| Context window | 1,000,000 tokens  | 200,000 tokens | 1,000,000 tokens  | 1,000,000 tokens  |
| Parameters     | n/a               | n/a            | 284B (13B active) | 1.6T (49B active) |
| Modality       | text, vision      | text, vision   | text              | text              |
| License        | proprietary       | proprietary    | MIT               | MIT               |
| Source         | proprietary       | proprietary    | open weights      | open weights      |

Description

- Claude 4.7 Opus: Anthropic's most capable model, optimized for complex reasoning and coding. Improves on Claude 4.6 in software engineering, long-running coding tasks, and vision resolution.
- o3: Reasoning-focused model in OpenAI's o-series.
- DeepSeek-V4-Flash: Smaller, faster sibling to DeepSeek-V4-Pro. Same 1M-token context window with a much lighter MoE: 284B total parameters, 13B active.
- DeepSeek-V4-Pro: DeepSeek's flagship open-weight MoE. 1.6T parameters with 49B activated, a 1M-token context window, and a hybrid attention scheme (CSA + HCA) that delivers long-context inference at ~27% of V3.2's FLOPs.
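The parameter column shows why active-parameter counts matter more than totals for MoE inference cost. As a rough sketch (not from this page), the common heuristic that a decoder-only transformer spends about 2 × N_active FLOPs per generated token lets you compare the two DeepSeek models; the 2N rule and the derived numbers below are illustrative assumptions, not vendor figures.

```python
# Rough per-token compute comparison for the two DeepSeek MoE models.
# Assumption: forward-pass cost ~ 2 * N_active FLOPs per token, so only
# the activated parameters (not the total) drive inference cost.

def flops_per_token(active_params: float) -> float:
    """Approximate forward-pass FLOPs per generated token."""
    return 2 * active_params

B = 1e9   # billion
T = 1e12  # trillion

models = {
    "DeepSeek-V4-Flash": {"total": 284 * B, "active": 13 * B},
    "DeepSeek-V4-Pro":   {"total": 1.6 * T, "active": 49 * B},
}

for name, p in models.items():
    active_frac = p["active"] / p["total"]
    print(f"{name}: {p['active'] / B:.0f}B active of "
          f"{p['total'] / B:.0f}B total ({active_frac:.1%}), "
          f"~{flops_per_token(p['active']) / B:.0f} GFLOPs/token")
```

Under this heuristic, Flash activates under 5% of its weights per token and Pro about 3%, which is how a 1.6T-parameter model can still be practical to serve.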