Could Advanced Micro Devices become the next growth story in Artificial Intelligence chips?

Advanced Micro Devices is expanding its Instinct data center lineup and winning marquee deals as it seeks a defensible niche against Nvidia in Artificial Intelligence chips. Growth has cooled amid export limits, tougher competition, and a product transition, but analysts still see significant upside.

Investors typically associate Artificial Intelligence chips with Nvidia, which dominates the discrete GPU market with a reported 94% share, leaving Advanced Micro Devices with about 6%. Yet AMD has steadily broadened its data center presence, particularly since launching its MI300 Instinct GPUs in late 2023 on TSMC’s 5-nanometer and 6-nanometer nodes. In several industry benchmarks, AMD’s MI300X surpassed Nvidia’s H100 on raw processing power and memory bandwidth, and buyers could acquire roughly four MI300-class GPUs for the price of one H100. Nvidia’s newer H200, a Hopper-generation successor to the H100, generally outperforms the MI300X for most Artificial Intelligence workloads, but it is still roughly twice as expensive. AMD also fields MI300 APUs that combine CPU and GPU on one chip, eliminating stand-alone CPUs for some deployments, while Nvidia does not make x86 CPUs or APUs.

Despite the product momentum, AMD’s data center growth has decelerated. Data center revenue rose 115% year over year in the second quarter of 2024, then slowed to 14% by the second quarter of 2025, with the segment’s contribution to total revenue sliding from about half to 42%. The company cites three headwinds: a U.S. export restriction that bars its MI308 shipments to China, tougher competition from Nvidia’s H200 and the customer lock-in created by Nvidia’s CUDA software ecosystem, and a pause in demand for current MI300X parts as customers wait for next-generation MI350 chips ramping in the second half of 2025. Even so, AMD is offsetting pressure with stronger sales of EPYC server CPUs, Ryzen PC CPUs, and Radeon PC GPUs, while continuing to gain ground on a struggling Intel in x86 CPUs.

AMD’s growing list of marquee customers underlines its progress. Oracle plans to install 50,000 AMD GPUs across its cloud infrastructure, and OpenAI aims to deploy up to 6 gigawatts of AMD GPUs over the next few years. OpenAI has also acquired stock warrants that could ultimately translate into a 10% stake in AMD. These moves suggest large customers want to diversify away from Nvidia’s platform. While AMD’s Instinct lineup may trail Nvidia’s Blackwell generation in peak performance, the chips can be optimized for memory-intensive tasks, and the APU option provides additional architectural flexibility.

Looking ahead, analysts project AMD’s revenue and earnings per share to grow at compound annual rates of 30% and 86%, respectively, from 2024 to 2027. The shares trade at 57 times next year’s earnings, a premium that reflects the runway in its Artificial Intelligence portfolio. AMD is unlikely to unseat Nvidia as the market leader, but if it continues to expand a defensible niche, both companies could thrive, and AMD could indeed be the next growth story in Artificial Intelligence chips.
