DeepSeek V4 Arrives With Near State-of-the-Art Intelligence At 1/6th the Cost
An anonymous reader quotes a report from VentureBeat: The whale has resurfaced. DeepSeek, the Chinese AI startup offshoot of the quantitative analysis firm High-Flyer Capital Management, became a near-overnight sensation globally in January 2025 with the release of its open-source R1 model, which matched proprietary U.S. giants. It's been an epoch in AI since then, and while DeepSeek has released several updates to that model and to its V3 series, the international AI and business community has largely been waiting with bated breath for the follow-up to the R1 moment.
Now it's arrived with last night's release of DeepSeek-V4, a 1.6-trillion-parameter Mixture-of-Experts (MoE) model available free under the commercially friendly, open-source MIT License. It nears, and on some benchmarks surpasses, the performance of the world's most advanced closed-source systems at approximately 1/6th the cost via the application programming interface (API).
This release, which DeepSeek AI researcher Deli Chen described on X as a "labor of love" 484 days after the launch of V3, is being hailed as the "second DeepSeek moment." As Chen noted in his post, "AGI belongs to everyone." It's available now on the AI code-sharing community Hugging Face and through DeepSeek's API. The new DeepSeek-V4-Pro model delivers "near-frontier performance" at a much lower price: $5.22 for 1 million input and 1 million output tokens, compared with $35 for GPT-5.5 and $30 for Claude Opus 4.7. That makes it roughly 1/7th the cost of GPT-5.5 and 1/6th the cost of Claude Opus 4.7, reinforcing VentureBeat's point that DeepSeek is "compressing advanced model economics into a much lower band."
While GPT-5.5 and Claude Opus 4.7 still lead on most benchmarks, DeepSeek-V4-Pro gets close enough that its lower cost could “force a major rethink of the economics of advanced AI deployment.”