Meta expands Nvidia partnership with multigenerational chip deal

Meta is deepening its reliance on Nvidia through a multigenerational deal to build data centers powered by the chipmaker’s GPUs, CPUs, and networking gear, a move that heightens Nvidia’s competition with Intel, AMD, and Google in artificial intelligence infrastructure.

Meta is significantly expanding its partnership with Nvidia in a multigenerational agreement that will see the social networking company build data centers powered by millions of Nvidia’s current and next-generation chips for artificial intelligence training and inference. The deal reinforces Nvidia’s position at the center of Meta’s artificial intelligence strategy, even as Meta continues to design its own custom chips and maintain relationships with alternative suppliers such as AMD. Meta has also reportedly explored using TPUs designed by rival Google, but analysts suggest the new agreement could dampen speculation about those potential deployments, while acknowledging that large technology companies often evaluate multiple vendors in parallel.

The expanded relationship comes amid intensifying competition in artificial intelligence infrastructure, where Nvidia retains a leading share but faces growing pressure from Google, AMD, and Broadcom. A crucial element of the partnership is Meta’s decision to deploy not only Nvidia’s GPUs but also its CPUs inside new data centers. CPUs, a market historically dominated by Intel and AMD, handle general-purpose computing tasks and work alongside GPUs, which are optimized for high-performance workloads such as artificial intelligence training and advanced graphics. By supplying both processor types, Nvidia stands to capture a larger share of Meta’s infrastructure spending and embed itself more deeply in Meta’s artificial intelligence technology stack, though soaring demand means rivals are unlikely to see an immediate decline in business.
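The division of labor described above, CPUs for general-purpose work and GPUs for the heavy numerical lifting, is easiest to see in a training loop. The sketch below is purely illustrative and is not drawn from Meta’s or Nvidia’s software: it assumes PyTorch and a hypothetical toy model, and simply shows the CPU preparing a batch of data while the forward and backward passes run on whatever GPU is available.

```python
# Illustrative sketch only (not from the article): how general-purpose CPU work
# and GPU-accelerated training typically divide in a PyTorch training step.
import torch
import torch.nn as nn

# Use a GPU for the compute-heavy part if one is present, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# General-purpose CPU work: loading and preparing a batch (stand-in random data here).
batch = torch.randn(64, 512)
labels = torch.randint(0, 10, (64,))

# High-performance work: the model's forward and backward passes run on the GPU.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

logits = model(batch.to(device))          # move the prepared batch CPU -> device
loss = loss_fn(logits, labels.to(device))
loss.backward()                           # backward pass also runs on the device
optimizer.step()
print(f"training step ran on {device}, loss={loss.item():.4f}")
```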

Analysts note that Nvidia has become more explicit about its CPU ambitions, including marketing its forthcoming Vera CPU as a stand-alone product and positioning CPUs as increasingly important as artificial intelligence workloads shift from model training to inference. One analyst said “CPUs tend to be cheaper and a bit more power-efficient for inference,” highlighting why Meta’s choice of a single vendor for both GPUs and CPUs may appeal to chief information officers who prefer a “one-throat-to-choke” approach to support and problem resolution. Beyond processors, Meta will adopt Nvidia’s networking equipment inside its data centers and use Nvidia’s confidential computing technology to power artificial intelligence features within WhatsApp. Nvidia said the companies will also collaborate on deploying its next-generation Vera CPUs in addition to the current Grace CPU, further extending the scope of the partnership.
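The analyst’s point about inference can also be sketched briefly. The example below is again illustrative rather than anything described in the article: it assumes PyTorch and a hypothetical toy model, and shows the same kind of network pinned to the CPU for serving, with autograd bookkeeping switched off so each request costs only a small forward pass.

```python
# Illustrative sketch only (not from the article): serving a small model for
# inference on a CPU, the scenario the quoted analyst describes as cheaper and
# more power-efficient than GPU serving for many workloads.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()                    # evaluation mode: disables dropout and similar layers
model = model.to("cpu")         # pin the deployment target to the CPU

request = torch.randn(1, 512)   # stand-in for a single incoming request
with torch.inference_mode():    # no gradient tracking needed when serving
    probs = torch.softmax(model(request), dim=-1)
print(f"predicted class: {probs.argmax(dim=-1).item()}")
```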

Impact Score: 68

European Parliament suspends artificial intelligence tools over data transfer risks

The European Parliament has temporarily disabled artificial intelligence features on official devices over fears that sensitive data could be routed to external cloud servers and foreign jurisdictions. The move highlights growing tension between data protection priorities and the widespread adoption of cloud-based artificial intelligence services.

Artificial intelligence and systemic risk in finance

Artificial intelligence is transforming financial markets with productivity and risk management gains, but its core features may also magnify systemic vulnerabilities and outpace regulators’ oversight.
