Artificial Intelligence growth is about to hit a memory wall

Orders for Artificial Intelligence chips and infrastructure are surging across vendors, but limited memory bandwidth risks slowing deployments and leaving GPU capacity underutilized. Industry executives say technologies such as Compute Express Link (CXL) and SSD advances can help, but no single cure exists.

Orders for Artificial Intelligence gear are pouring in across the industry, not just at chip giant Nvidia. The article reports that AMD has secured deals with Meta, Microsoft, OpenAI and Oracle, that Cisco has already booked Artificial Intelligence orders this year (the amount is not stated), and that Broadcom landed a contract (value not stated) for racks built on its XPU chips. The surge in compute and infrastructure spend is colliding with constraints data centers are already wrestling with: power, cooling and networking demands.

Beyond those limits, memory bandwidth is emerging as a central bottleneck for scaling Artificial Intelligence. Experts quoted in the article say GPUs are often restricted by the need to reach external memory over interconnects that slow data movement, and CPUs face the same wall. J. Gold Associates founder Jack Gold is quoted saying that moving memory closer to GPUs, with faster access, yields major performance improvements. ScaleFlux VP of products JB Baker described the mismatch as a gap between how many calculations chips can perform and how much memory bandwidth exists to feed them, comparing it to a massive water source drained through a garden hose.
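The gap Baker describes can be made concrete with a back-of-the-envelope roofline-style estimate: a chip's usable compute is capped by how fast memory can feed it. The figures below are hypothetical illustrations, not vendor specifications.

```python
# Roofline-style sketch of the compute vs. memory bandwidth mismatch.
# All numbers are hypothetical, chosen only to illustrate the gap.

def ridge_point(peak_flops: float, mem_bandwidth: float) -> float:
    """FLOPs a kernel must perform per byte moved to keep the chip busy."""
    return peak_flops / mem_bandwidth

def attainable_flops(peak_flops: float, mem_bandwidth: float,
                     arithmetic_intensity: float) -> float:
    """Achievable FLOP/s for a kernel with the given FLOPs-per-byte ratio."""
    return min(peak_flops, mem_bandwidth * arithmetic_intensity)

peak = 1000e12   # 1,000 TFLOP/s of compute (hypothetical accelerator)
bw = 3e12        # 3 TB/s of memory bandwidth (hypothetical)

# A memory-bound kernel, e.g. large matrix-vector work at ~1 FLOP per byte:
usable = attainable_flops(peak, bw, arithmetic_intensity=1.0)
print(f"ridge point: {ridge_point(peak, bw):.0f} FLOPs/byte")
print(f"fraction of peak actually usable: {usable / peak:.1%}")
```

With these illustrative numbers, a kernel would need hundreds of FLOPs per byte moved to saturate the chip; anything below that leaves compute idle, which is the "garden hose" effect the article describes.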

There is no single cure, but steps are underway. The article points to Compute Express Link (CXL), solid state drive advances and vendor efforts such as Nvidia's NVLink and Storage-Next initiatives as partial solutions. It also names other players, including Intel spin-off Cornelis, working on speeding memory access. Baker argued that on-premises adopters must rebalance capital expenditures to invest in memory, storage and networking alongside GPUs, warning that underinvesting in those areas wastes GPU spend and burns power on idle processors.

Impact Score: 72

Ajinomoto’s quiet grip on a material powering Artificial Intelligence chips

Japanese food giant Ajinomoto has become a critical chokepoint in the semiconductor supply chain by controlling nearly all production of a specialized insulating film used in advanced Artificial Intelligence processors. Its Ajinomoto Build-up Film underpins high performance Nvidia-style chips and is extremely difficult for rivals to replicate.
