1.6 Trillion Parameters - How DeepSeek V4 is Redefining Open Source AI

The Chinese AI company DeepSeek just released DeepSeek V4, the largest open-weights model to date. But can it outperform the models of OpenAI, Anthropic, and Google?

Image: AI-generated illustration with the text "Open Source AI DeepSeek V4"

The details

  • DeepSeek V4 comes in two versions: DeepSeek-V4-Pro (1.6T total / 49B active parameters) and DeepSeek-V4-Flash (284B total / 13B active parameters).
  • Both variants are mixture-of-experts models and have a context window of 1M tokens.
  • DeepSeek V4 has a new architecture that significantly reduces computing requirements for long texts. For more technical information, check out the paper.
  • DeepSeek can offer its models via API at significantly lower prices than OpenAI, Anthropic, or Google. The models of DeepSeek V4 are available on Hugging Face.
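The total-versus-active parameter split above comes from the mixture-of-experts design: a layer holds many experts, but each token is routed to only a few of them, so compute per token tracks the active count rather than the total. Here is a minimal NumPy sketch of plain top-k gating with toy sizes; this is a generic illustration of the technique, not DeepSeek V4's actual router or configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer. All sizes are illustrative, not DeepSeek V4's.
N_EXPERTS = 8   # experts in the layer (drives TOTAL parameter count)
TOP_K = 2       # experts used per token (drives ACTIVE parameter count)
D_MODEL = 16    # hidden dimension

# Each expert is a simple linear map; the router scores experts per token.
experts = rng.standard_normal((N_EXPERTS, D_MODEL, D_MODEL)) * 0.02
router = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.02

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                              # (tokens, N_EXPERTS)
    topk = np.argsort(logits, axis=-1)[:, -TOP_K:]   # chosen experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, topk[t]]
        weights = np.exp(chosen) / np.exp(chosen).sum()  # softmax over top-k
        for w, e in zip(weights, topk[t]):
            out[t] += w * (x[t] @ experts[e])        # only TOP_K experts run
    return out, topk

x = rng.standard_normal((4, D_MODEL))
y, routed = moe_forward(x)
print(y.shape)       # (4, 16)
print(routed.shape)  # (4, 2): each token touched 2 of the 8 experts
```

Scaled up, the same idea is how a model can hold 1.6T parameters while only ~49B participate in any single forward pass.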

Our thoughts

DeepSeek-V4-Pro is currently the largest open-weights model available. Based on our research, however, it does not yet match the top models from OpenAI, Anthropic, and Google.

Like OpenAI and Anthropic, DeepSeek primarily targets developers, which is why coding benchmarks feature so prominently in model release publications these days.

More information: 🔗 DeepSeek Docs | DeepSeek Tech Report



😀 Do you enjoy our content? If so, why not support us with a small financial contribution? As a supporter, you can comment on newsletter editions (e-mail version) and read our website without banner ads.