
Compare models

Pick up to 4 models. Specs render side-by-side. Share the URL; it's stateless.
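A minimal sketch of how a stateless share URL could carry the selection, assuming a hypothetical `models` query parameter and slug-style model identifiers (the page does not document its actual URL format):

```ts
// Hypothetical sketch only: the parameter name "models" and the slug values
// are assumptions, not gpt.buzz's documented format.
const MAX_MODELS = 4;

function encodeCompareUrl(base: string, slugs: string[]): string {
  const url = new URL(base);
  url.searchParams.set("models", slugs.slice(0, MAX_MODELS).join(","));
  return url.toString();
}

function decodeCompareUrl(href: string): string[] {
  const raw = new URL(href).searchParams.get("models") ?? "";
  return raw.split(",").filter(Boolean).slice(0, MAX_MODELS);
}

// Everything needed to render the comparison lives in the link itself,
// so sharing it requires no server-side session.
const link = encodeCompareUrl("https://gpt.buzz/compare", [
  "o3",
  "claude-4.6-sonnet",
  "deepseek-v3.1",
  "deepseek-v4-pro",
]);
console.log(link);                   // selection encoded in the query string
console.log(decodeCompareUrl(link)); // ["o3", "claude-4.6-sonnet", "deepseek-v3.1", "deepseek-v4-pro"]
```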

Selected models:

- o3 (OpenAI)
- Claude 4.6 Sonnet (Anthropic)
- DeepSeek-V3.1 (DeepSeek)
- DeepSeek-V4-Pro (DeepSeek)

|                | o3             | Claude 4.6 Sonnet | DeepSeek-V3.1  | DeepSeek-V4-Pro   |
|----------------|----------------|-------------------|----------------|-------------------|
| Vendor         | OpenAI         | Anthropic         | DeepSeek       | DeepSeek          |
| Family         | o-series       | Claude            | DeepSeek       | DeepSeek          |
| Release date   | 2025-04-16     |                   | 2025-08-21     | 2026-04-22        |
| Context window | 200,000 tokens | 200,000 tokens    | 128,000 tokens | 1,000,000 tokens  |
| Parameters     |                |                   | 671B           | 1.6T (49B active) |
| Modality       | text, vision   | text, vision      | text           | text              |
| License        | proprietary    | proprietary       | MIT            | MIT               |
| Source         | proprietary    | proprietary       | open weights   | open weights      |

Descriptions:

- o3: Reasoning-focused model in the o-series.
- DeepSeek-V3.1: Large MoE open-weight model. Predecessor to DeepSeek-V4.
- DeepSeek-V4-Pro: DeepSeek's flagship open-weight MoE. 1.6T parameters with 49B activated, 1M-token context, and a hybrid attention scheme (CSA + HCA) that delivers long-context inference at ~27% of V3.2's FLOPs.
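As a side note on the V4-Pro row, the activated-parameter fraction follows directly from the table's own numbers; the ~27% FLOPs figure is the page's claim about the attention scheme, not something derived here.

```ts
// Uses only the figures quoted in the spec table above.
const totalParams = 1.6e12; // 1.6T total parameters
const activeParams = 49e9;  // 49B parameters activated per token (MoE routing)

const activeFraction = activeParams / totalParams; // 0.030625
console.log(`${(activeFraction * 100).toFixed(1)}% of weights active per token`); // ~3.1%
```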
