
Compare models

Pick up to 4 models. Specs render side-by-side. Share the URL; it's stateless.
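A stateless share URL encodes the entire selection in the query string, so the server keeps no session for a shared comparison. A minimal sketch of that pattern; the `models` parameter name and the model slugs here are illustrative assumptions, not gpt.buzz's actual scheme:

```python
from urllib.parse import urlencode, urlparse, parse_qs

MAX_MODELS = 4  # the page allows at most 4 models side-by-side

def build_share_url(base: str, slugs: list[str]) -> str:
    """Encode the selected model slugs into a stateless query string."""
    if len(slugs) > MAX_MODELS:
        raise ValueError(f"Pick up to {MAX_MODELS} models")
    # hypothetical parameter name 'models'; comma-separated slugs
    return f"{base}?{urlencode({'models': ','.join(slugs)})}"

def parse_share_url(url: str) -> list[str]:
    """Recover the selection from a shared URL; no server state needed."""
    qs = parse_qs(urlparse(url).query)
    value = qs.get("models")
    return value[0].split(",") if value else []

url = build_share_url("https://gpt.buzz/compare",
                      ["claude-4.7-opus", "deepseek-r1"])
# parse_share_url(url) round-trips back to the original slug list
```

Because the selection lives entirely in the URL, anyone opening the link reconstructs the same comparison without the site storing anything.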


| | Claude 4.7 Opus | DeepSeek-R1 | DeepSeek-V4-Pro | Mistral Large 2 |
| --- | --- | --- | --- | --- |
| Vendor | Anthropic | DeepSeek | DeepSeek | Mistral |
| Family | Claude | DeepSeek | DeepSeek | Mistral |
| Release date | 2026-04-16 | 2025-01-20 | 2026-04-22 | 2024-07-24 |
| Context window | 1,000,000 tokens | 128,000 tokens | 1,000,000 tokens | 128,000 tokens |
| Parameters | not disclosed | 671B | 1.6T (49B active) | 123B |
| Modality | text, vision | text | text | text |
| License | proprietary | MIT | MIT | Mistral Research License |
| Source | proprietary | open weights | open weights | open weights |

Descriptions:

- Claude 4.7 Opus: Anthropic's most capable model, optimized for complex reasoning and coding. Improved software engineering, long-running coding tasks, and higher-resolution vision over Claude 4.6.
- DeepSeek-R1: Reasoning-focused open-weight model.
- DeepSeek-V4-Pro: DeepSeek's flagship open-weight MoE. 1.6T parameters with 49B activated, 1M-token context, and a hybrid attention scheme (CSA + HCA) that delivers long-context inference at ~27% of V3.2's FLOPs.
