DeepSeek-v4 raises pressure in the global Artificial Intelligence model race

DeepSeek’s new open source V4 models combine long context windows, low pricing and growing support for Huawei chips. The release sharpens competition on cost and highlights China’s push for a more sovereign Artificial Intelligence stack.

DeepSeek released DeepSeek-V4 in preview on April 24, expanding its push into open source Artificial Intelligence with a model family aimed at long-horizon reasoning, coding and agentic workflows. The launch is the company’s most significant model release since R1 in January 2025. DeepSeek-V4 comes in two versions: V4-Pro and V4-Flash. V4-Pro is the larger model, with 1.6 trillion total parameters, while V4-Flash is a high-speed variant with 284 billion parameters. Both are designed for efficiency in long-context scenarios, supporting context windows of up to 1 million tokens. That puts the V4 line in competition with long-context offerings from Google Gemini and Anthropic Claude.

Pricing is central to the model’s appeal for enterprises weighing performance against budget. V4-Pro costs ?.74 per million input tokens and ?.48 per million output tokens, while V4-Flash costs ?.14 per million input tokens and ?.28 per million output tokens. By comparison, Gemini 3.1 Pro costs ? for input and ? for output, GPT 5.5 costs ? for input and ? for output, and Claude Opus 4.7 costs ? for input and ? for output. Analysts cited in the report said the lower token pricing could materially shift enterprise buying behavior, especially for use cases where having the absolute top model matters less than predictable cost and acceptable performance.
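Per-million-token pricing translates directly into a usage bill. A minimal sketch of how an enterprise might estimate costs under this billing model; the prices used here are placeholders, since the exact figures above did not survive extraction:

```python
def usage_cost(input_tokens: int, output_tokens: int,
               price_in_per_m: float, price_out_per_m: float) -> float:
    """Cost in dollars for a workload billed per million tokens."""
    return (input_tokens / 1_000_000) * price_in_per_m \
         + (output_tokens / 1_000_000) * price_out_per_m

# Hypothetical monthly workload and hypothetical prices
# ($0.50 per million input tokens, $1.50 per million output tokens):
monthly = usage_cost(input_tokens=500_000_000,   # 500M input tokens
                     output_tokens=50_000_000,   # 50M output tokens
                     price_in_per_m=0.50,
                     price_out_per_m=1.50)
print(f"${monthly:,.2f}")  # 500 * 0.50 + 50 * 1.50 = $325.00
```

Because input tokens usually dominate volume while output tokens carry the higher rate, comparing providers requires weighting both prices by the workload’s actual input/output mix, not just the headline input rate.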

The release also marks a shift in DeepSeek’s hardware strategy. The V4 series is optimized for inference on Huawei’s Ascend supernode, reducing dependence on Nvidia. Earlier DeepSeek models such as V3 were trained on Nvidia H800 chips, but V4-Flash is reported to be partially trained on Huawei hardware, while V4-Pro still relied on Nvidia due to its massive compute needs. Analysts said the DeepSeek and Huawei relationship strengthens both companies and could help China gain credibility with other vendors, even as Huawei remains behind Nvidia in chip fabrication and software development.

Open source remains a core differentiator for DeepSeek at a time when some U.S. vendors have moved away from that model. Analysts said open source can help attract developers, build trust and expand an ecosystem, while also giving China a path into price-sensitive markets outside the U.S. At the same time, adoption in Western markets remains constrained by government scrutiny and enterprise caution, particularly in critical industries. Even so, DeepSeek’s combination of open source, lower prices and reduced reliance on Western chips is emerging as an important competitive force in the broader global Artificial Intelligence race.

Impact Score: 74

Samsung strike threat raises chip supply risks

A possible labor strike at Samsung Electronics in South Korea is raising concerns about chip production disruptions, client defections, and pressure on its position in the global semiconductor race. The dispute centers on bonus rules, but the larger risk is damage to Samsung’s credibility as a reliable supplier for major tech customers.

Microsoft previews Shader Model 6.10 for GPU Artificial Intelligence engines

Microsoft has introduced Shader Model 6.10 in Agility SDK 1.720-preview with a new matrix API designed to unify access to dedicated GPU Artificial Intelligence hardware from AMD, Intel, and NVIDIA. The change is aimed at making neural rendering features easier to deploy across multiple vendors with a single programming model.

Europe’s Artificial Intelligence challenge is structural dependence

Europe has talent, research strength, and rising investment in Artificial Intelligence, but startups remain reliant on American infrastructure, platforms, and late-stage capital. The argument centers on digital sovereignty, interoperability, and ownership as the conditions for building durable European champions.

Community backlash slows Artificial Intelligence data center expansion

Political resistance, regulatory scrutiny, and rising energy and water concerns are complicating the build-out of large Artificial Intelligence data centers across the United States. The pressure is increasing costs, delaying projects, and adding fresh risks to the economics behind Generative Artificial Intelligence infrastructure.
