DeepSeek Releases Its Open-Source V4 Preview Models, with a Cost-Effective 1M Context Length
April 24, 2026

IBL News | New York
Chinese AI startup DeepSeek launched two preview versions of its newest large language model yesterday: DeepSeek V4 Flash and V4 Pro, with context windows of 1 million tokens each — enough to include large codebases or documents in prompts.
DeepSeek V4 Pro costs $1.74 per million input tokens and $3.48 per million output tokens, while V4 Flash costs $0.14 per million input tokens and $0.28 per million output tokens — both the cheapest in their class.
The company stated that it has almost “closed the gap” with current leading models, both open and closed, on reasoning benchmarks.
The company claims its new V4 Pro model outperforms its open-source peers across reasoning benchmarks and outstrips OpenAI’s GPT-5.2 and Gemini 3.0 Pro on some tasks. In coding competition benchmarks, DeepSeek said both V4 models’ performance is “comparable to GPT-5.4.”
The Pro model has a total of 1.6 trillion parameters (49 billion active), making it the largest open-weight model available — ahead of Moonshot AI’s Kimi K2.6 (1.1 trillion) and MiniMax’s M1 (456 billion), and more than double DeepSeek V3.2 (671 billion). The smaller V4 Flash has 284 billion parameters (13 billion active).
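The gap between total and active parameters reflects mixture-of-experts routing, where only a fraction of the weights fire for each token. A back-of-envelope comparison of the figures cited above (the parameter counts are from the article; the active-share percentage is simple arithmetic, not a benchmark result):

```python
# Total vs. active parameter counts for the two V4 preview models,
# in billions of parameters, as reported in the article.
models = {
    "DeepSeek V4 Pro":   {"total_b": 1600, "active_b": 49},
    "DeepSeek V4 Flash": {"total_b": 284,  "active_b": 13},
}

for name, p in models.items():
    # Share of weights that are active for any single token.
    share = p["active_b"] / p["total_b"] * 100
    print(f"{name}: {p['active_b']}B of {p['total_b']}B active (~{share:.1f}%)")
```

In other words, each token touches only roughly 3–5% of the weights in either model, which is how per-token compute stays far below what the headline parameter counts suggest.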
Notably, DeepSeek V4 is much more affordable than any frontier model available today.
• The smaller V4 Flash model costs $0.14 per million input tokens and $0.28 per million output tokens, undercutting GPT-5.4 Nano, Gemini 3.1 Flash, GPT-5.4 Mini, and Claude Haiku 4.5.
• The larger V4 Pro model, meanwhile, costs $1.74 per million input tokens and $3.48 per million output tokens, also undercutting Gemini 3.1 Pro, GPT-5.5, Claude Opus 4.7, and GPT-5.4.
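At per-token rates like these, the cost of a request is just input tokens times the input rate plus output tokens times the output rate. A minimal sketch using the listed prices — the 1M-token prompt and 2,000-token reply are hypothetical request sizes chosen for illustration:

```python
def request_cost(input_tokens, output_tokens, in_rate, out_rate):
    """Price of one request, with rates quoted in $ per million tokens."""
    return input_tokens / 1e6 * in_rate + output_tokens / 1e6 * out_rate

# A full 1M-token prompt with a 2,000-token reply, at the listed rates.
flash = request_cost(1_000_000, 2_000, in_rate=0.14, out_rate=0.28)
pro   = request_cost(1_000_000, 2_000, in_rate=1.74, out_rate=3.48)
print(f"V4 Flash: ${flash:.4f}   V4 Pro: ${pro:.4f}")
```

By this arithmetic, filling the entire 1M-token context window costs about $0.14 on V4 Flash and about $1.75 on V4 Pro — input dominates, since the prompt is five hundred times larger than the reply.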
DeepSeek said both models are more efficient and performant than DeepSeek V3.2 due to architectural improvements.
Both V4 Flash and V4 Pro support text only, unlike many of their closed-source peers, which support understanding and generating audio, video, and images.
The launch of DeepSeek V4 Flash and V4 Pro comes a day after the U.S. accused China of stealing American AI companies’ IP on an industrial scale using thousands of proxy accounts. DeepSeek itself has been accused by Anthropic and OpenAI of “distilling,” essentially copying, their AI models.
🚀 DeepSeek-V4 Preview is officially live & open-sourced! Welcome to the era of cost-effective 1M context length.
🔹 DeepSeek-V4-Pro: 1.6T total / 49B active params. Performance rivaling the world's top closed-source models.
🔹 DeepSeek-V4-Flash: 284B total / 13B active params.… pic.twitter.com/n1AgwMIymu
— DeepSeek (@deepseek_ai) April 24, 2026
IBL News is funded by the New York-based, family-owned company ibl.ai. Our stories adhere to the highest ethical standards in journalism and are available to news syndication agencies.