gpt.buzz

Compare models

Pick up to 4 models. Specs render side-by-side. Share the URL: it's stateless.
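Since the page says the share URL is stateless, the entire selection must be encoded in the URL itself rather than in a server-side session. A minimal sketch of how that could work, assuming a hypothetical `models` query parameter and illustrative model IDs (neither is confirmed by the page):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def share_url(base: str, model_ids: list[str]) -> str:
    # Encode the whole selection in the query string; nothing is stored server-side.
    return f"{base}?{urlencode({'models': ','.join(model_ids)})}"

def parse_selection(url: str) -> list[str]:
    # Anyone opening the link can recover the selection from the URL alone.
    query = parse_qs(urlparse(url).query)
    return query.get("models", [""])[0].split(",")

url = share_url("https://gpt.buzz/compare",
                ["mistral-large-2", "claude-4.7-opus", "deepseek-v4-pro", "o3"])
print(parse_selection(url))
# ['mistral-large-2', 'claude-4.7-opus', 'deepseek-v4-pro', 'o3']
```

Because the URL carries all state, two people opening the same link always see the same comparison.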

Selected models:

- Mistral Large 2 (Mistral)
- Claude 4.7 Opus (Anthropic)
- DeepSeek-V4-Pro (DeepSeek)
- o3 (OpenAI)

| Spec | Mistral Large 2 | Claude 4.7 Opus | DeepSeek-V4-Pro | o3 |
| --- | --- | --- | --- | --- |
| Vendor | Mistral | Anthropic | DeepSeek | OpenAI |
| Family | Mistral | Claude | DeepSeek | o-series |
| Release date | 2024-07-24 | 2026-04-16 | 2026-04-22 | 2025-04-16 |
| Context window | 128,000 tokens | 1,000,000 tokens | 1,000,000 tokens | 200,000 tokens |
| Parameters | 123B | n/a | 1.6T (49B active) | n/a |
| Modality | text | text, vision | text | text, vision |
| License | Mistral Research License | proprietary | MIT | proprietary |
| Source | open weights | proprietary | open weights | proprietary |

Descriptions:

- Claude 4.7 Opus: Anthropic's most capable model, optimized for complex reasoning and coding. Improved software engineering, long-running coding tasks, and higher-resolution vision over Claude 4.6.
- DeepSeek-V4-Pro: DeepSeek's flagship open-weight MoE. 1.6T parameters with 49B activated, 1M-token context, and a hybrid attention scheme (CSA + HCA) that delivers long-context inference at ~27% of V3.2's FLOPs.
- o3: Reasoning-focused model in the o-series.
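The MoE figures quoted for DeepSeek-V4-Pro can be sanity-checked with a quick calculation: only a small fraction of the total parameters is activated per token, which is what makes inference on a 1.6T-parameter model tractable.

```python
# Figures taken from the comparison above.
total_params = 1.6e12    # 1.6T total parameters
active_params = 49e9     # 49B parameters activated per token

active_fraction = active_params / total_params
print(f"{active_fraction:.1%}")  # about 3.1% of parameters active per token
```

So per-token compute scales with the 49B active parameters, not the 1.6T total; the separately quoted ~27% FLOPs figure for long-context inference comes from the hybrid attention scheme, not from the MoE routing.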
