DeepSeek Drops V4: 1.6 Trillion Parameters, 1M Context Window

DeepSeek's V4 Pro is its biggest model yet, claiming parity with top closed-source rivals from OpenAI and Google DeepMind.

DeepSeek just unveiled its V4 lineup, and the numbers are massive. The V4 Pro model packs 1.6 trillion total parameters, making it the largest model the Chinese AI lab has ever released by that metric. Its leaner sibling, V4 Flash, comes in at 284 billion parameters.

Both models share a 1 million token context window — enough headroom to take in book-length documents or large codebases in a single prompt.

The real headline: DeepSeek claims V4 is competitive with the best closed-source models from OpenAI and Google DeepMind. That's a bold statement from an open-weight contender that's been steadily closing the gap with Western AI giants.

No word yet on benchmark specifics or pricing, but if the claims hold up, the already-heated LLM race just gained another heavyweight contender — this one with open weights.