Qualcomm enters artificial intelligence data center chip race against Nvidia and AMD

Qualcomm is pushing into the data center market with new artificial intelligence accelerators and full server systems, positioning itself against Nvidia and AMD while aiming to diversify beyond smartphones.

Qualcomm (QCOM) shares soared more than 20% Monday before closing the day up 11% after the company announced it is entering the data center market with its new AI200 and AI250 chips and rack-scale server offerings, designed to compete with Nvidia and AMD. The move targets a slice of the multibillion-dollar data center opportunity and is part of a broader effort to reduce Qualcomm's dependence on smartphone chips and licensing revenue.

The new lineup centers on the AI200 and AI250 accelerators, which are tightly integrated with Qualcomm CPUs in full server rack systems. Available beginning in 2026, AI200 refers both to Qualcomm's individual AI accelerator and to the full server rack it slots into, complete with a Qualcomm CPU. The AI250 is Qualcomm's next-generation AI accelerator and server coming in 2027, and a third chip and server are scheduled for 2028, with Qualcomm committing to an annual cadence. The accelerators use Qualcomm's custom Hexagon NPU, a neural processing unit that builds on experience from its Windows PC chips, scaled up specifically for data center workloads focused on artificial intelligence inference rather than training.

Qualcomm is emphasizing total cost of ownership as a competitive advantage, highlighting low power consumption and systems tuned for running artificial intelligence models. That metric has become central for data center builders as they try to contain the dizzying costs of constructing and running their enormous server farms. The key difference between the two chips, Qualcomm explained, is that the AI250 will offer 10 times the memory bandwidth of the AI200.

Customers will be able to buy individual chips, parts of the server configuration, or complete racks, and potential buyers could even include rivals such as Nvidia and AMD, creating a mix of competition and partnership. The push follows an earlier, unsuccessful data center effort built around the Qualcomm Centriq 2400 platform, launched in 2017 with Microsoft, and comes as hyperscalers like Amazon, Google, and Microsoft invest in their own artificial intelligence silicon.

In Q3, Qualcomm reported total revenue of $10.4 billion, of which $6.3 billion came from its handset business. Qualcomm does not currently report data center revenue, but that could change if the new artificial intelligence server and chip portfolio gains traction in a market dominated by Nvidia and AMD.
